
Tech and how it cuts off pathways to developing emotional intelligence.

tothet, Oct 12, 2019, 3:50:21 PM

Emotional intelligence starts with the ability to manage one's own emotions, extends to being able to sense and respond to the emotions of others, and concludes with the ability to manage the emotions of groups. Many tech products are designed to cater to our emotions without our active participation. I would argue that, specifically because of their predictive nature, tech products do not allow us to process our emotions in a healthy way, leading to constant consumption and dissatisfaction. Let's take a look at two key features which have become commonplace and how they eliminate the opportunity to develop emotional intelligence.

1. Predictive text

Predictive text autofills a word as soon as a user begins typing it. It is different from spell check, where the user is given spelling options and has to select one to continue their sentence. Predictive text comes as a default on many texting applications and has been incorporated into messaging apps like Skype and email clients like Gmail. In a world overflowing with emails, messages, and other demands for our attention, these tools have been marketed to us as time-savers. However, they often lead to communication that is reactive, generic, and lacking in personality; it never fully addresses the issue and so results in more communication.
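To make the mechanism concrete, here is a minimal sketch of how a predictive-text feature might pick its one suggested word, using a toy bigram model in Python. Real products such as Gmail's Smart Compose rely on large neural language models trained on vast amounts of text; the corpus, function names, and output below are illustrative assumptions, not any vendor's actual implementation.

```python
# Toy bigram predictor: count which word tends to follow which,
# then offer the single most likely continuation.
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count word-to-next-word transitions in the training sentences."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for prev, nxt in zip(words, words[1:]):
            counts[prev][nxt] += 1
    return counts

def suggest_next(counts, prev_word):
    """Return the single most likely next word, or None if unknown."""
    followers = counts.get(prev_word.lower())
    if not followers:
        return None
    return followers.most_common(1)[0][0]

# Made-up training data standing in for a user's past messages.
corpus = [
    "thanks for the update",
    "thanks for the update",
    "thanks for the invite",
    "sounds good to me",
]
model = train_bigrams(corpus)
print(suggest_next(model, "for"))   # -> "the"
print(suggest_next(model, "the"))   # -> "update"
```

Notice that the user's only role in this loop is to accept or ignore the one word the model puts forward.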

Predictive text doesn't give us the opportunity to exercise emotional intelligence because it eliminates the need to formulate our thoughts. All we are required to do is press Enter to confirm or reject the prediction. Almost without our noticing, predictive text reduces every opportunity to communicate what we are truly feeling to a yes or a no.

2. Search suggestions

Search suggestions are based on the user's browsing history. YouTube is a perfect example. Because it is owned by Google, it gathers not only your YouTube search history but also your Google searches and the information you enter into other Google applications, such as your location data and your contacts. Based on a myriad of variables derived from your activity and the activity of others, YouTube is able to identify patterns and suggest what you might like to watch next. As I noticed myself slipping into binge-watching YouTube, I started replacing its entertainment with yoga videos, binaural beats, and a weekly episode of John Oliver, which sent my YouTube back to its early beginnings: suggesting similar videos from the same creators simply because I had watched those channels. With too few data points, YouTube was reduced to cycling through the same suggestions. I got my life back.
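For illustration, here is a minimal sketch of history-based suggestion, assuming nothing more than a per-channel watch count. YouTube's real recommender draws on far more signals (search history, location, contacts, the behavior of other users); the channel names, titles, and ranking below are made up for the example.

```python
# Toy recommender: rank unwatched videos by how often the user
# has watched videos from the same channel.
from collections import Counter

def suggest_videos(watch_history, catalog, k=3):
    """Return up to k unwatched videos, favoring already-watched channels."""
    channel_counts = Counter(video["channel"] for video in watch_history)
    seen = {video["title"] for video in watch_history}
    unwatched = [v for v in catalog if v["title"] not in seen]
    return sorted(
        unwatched,
        key=lambda v: channel_counts[v["channel"]],
        reverse=True,
    )[:k]

# Made-up watch history and catalog.
history = [
    {"title": "Morning yoga flow", "channel": "Yoga With Adriene"},
    {"title": "Evening stretch", "channel": "Yoga With Adriene"},
    {"title": "Last Week Tonight: Automation", "channel": "LastWeekTonight"},
]
catalog = history + [
    {"title": "30-minute yoga for focus", "channel": "Yoga With Adriene"},
    {"title": "Last Week Tonight: Robocalls", "channel": "LastWeekTonight"},
    {"title": "Top 10 fail compilation", "channel": "ClickbaitCentral"},
]
for video in suggest_videos(history, catalog):
    print(video["title"])
```

With only a couple of channels in the history, the ranking has little to work with, which is why starving the system of data points leaves it cycling through the same suggestions.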

Search suggestions don't give us the opportunity to exercise emotional intelligence because they eliminate the decision. By showing what seems like a logical next step, companies like YouTube trap their users in a constant stream of consumption. Emotional needs take a back seat in a system built to constantly entertain rather than to satisfy.

Because predictive text and search suggestions do the emotional labor for us based on our past behavior, they leave no opportunity to process how we are feeling and put together a response of our own. The output is patterns. Predictable patterns, rather than complex human beings, are easier to analyze, classify, and target. In an internet ruled by ads competing for attention, retaining the ability to make one's own decisions is a rebellious act.