Over the past few years, scientists, pundits, and armchair psychologists have started questioning technology's effects on our humanity.
Our fascination with social media (say, using Tumblr for six hours per day), our reliance on GPS to find an urban destination, or even a simple Google search as a replacement for remembering the capital of Nebraska could be transforming us.
Most technologists reckon we're changing for the better. Our gadgets and gizmos are helping us to connect more with each other, stay alert when we drive, and discover more information.
But a few researchers suggest we are changing for the worse. No, they're not saying that 'the sky is falling' and we ought to panic, but they are worried about our digital transformation. And, they say, this potential dehumanisation might not happen for another 100 years or more.
Here's a new term to consider: sensory dynamism. The concept has to do with our perception. When you look out of a window, you perceive millions of variances - colour, perspective, sound, feeling, and many others. But when you gaze at an iPad, you're sensing just a few variables - and with email and SMS, you may barely be using your senses. That could pose a problem in the long run for future human development.
Neema Moraveji is the director of the Calming Technology Lab at Stanford University. He says sensory dynamism can be a problem when it comes to an over-reliance on computer technology. (To address the concern, his team is working on adding more sensory stimulus to gadgets, computer screens, and other devices.)
Moraveji says technology can sometimes cloud our sensory judgement. We see only factual and textual information instead of an array of human emotions.
"Technology makes us less human when we believe life is a rat race to be won - a zero-sum mentality - and when we are isolated and individual rather than interconnected, and primarily competitive rather than primarily collaborative," he says.
"I describe the brain as an organ whose job it is to learn through its physiochemical and cognitive senses. Without sufficient dynamism, the brain becomes focused on particular senses and inputs that are not representative of the natural world."
Ironically, one of the answers may lie in videogame technology. Unlike the flat graphics of a phone displaying text, games at least mimic the sensations of sound, light, and emotion in a more realistic virtual world. Game technology is also advancing - some day, we might 'smell' a rainforest or 'touch' an alien's skin.
Strictly speaking, implantable electronics make us less human: we become, in some percentage, machine. Of course, the first cardiac pacemakers were invented back in the 1950s - and saying someone is 'less human' because they have a pacemaker is a bit harsh.
Yet some of us might have an implant to enhance vision or read text messages directly into the synapse, or might use a bio-skeleton for enhanced strength. In 100 years, embedded technology could replace more and more of our human anatomy.
Dr Bridget Duffy is the chief medical officer at Vocera, a company that makes a wireless communicator for use in hospitals. She talks of an '80-20' rule in the health profession. In some cases, only 20% of healing occurs because of a drug treatment or surgery, while 80% of the success depends on patient-doctor interaction. If a 'human being' transforms into something that's more electronic than biological, there is a concern that a future society will lose the distinctions of emotional connection.
"There is something about hope, communication, and trust that improves the outcome," Duffy says. "You can focus on a good technical outcome, but there has to be the other component. When you know a loved one who has faced mortality and a life-threatening illness, the implant is not enough - there is something about physical contact."
Duffy explains that in many surgery rooms, it's not uncommon for the entire staff to touch and speak directly to the patient. But it's already possible, she says, for a doctor to perform a procedure entirely from "behind the glass" without ever meeting a patient, robotically controlling all of the instruments.
Following this path, could a future total reliance on medical technology make us less human? Patients might even, for instance, be able to perform home surgeries, but the 'less human' argument hints that this could result in fewer successful surgeries and affect our long-term health as a society. Duffy says the 80-20 rule might even be applied across all technology - we should have real human contact 80% of the time and restrict virtual experience to the remaining 20%.
Search has put a world of information at our fingertips. We can search for information about the latest Syrian army attacks, or find out about Himalayan fruit flies. In 2010, however, Nicholas Carr wrote a seminal book on whether search is making us stupid. The Shallows: What the Internet Is Doing to Our Brains recounts how our search dependence could have ill effects in society when we lose our ability to self-reason.
Search tech has evolved dramatically over the last 15 years - no one knows the role it may have in our lives in another 50 or 100. Yet even Matt Wallaert, a behavioural scientist at Bing, questions whether it is good to become wholly dependent on search. He says researchers suspect the human brain needs serendipitous discovery. A famous example is the 'hidden object' picture: you stare at an image until the concealed shape suddenly resolves. Wallaert says our brains receive a pleasure response from solving the puzzle.
"When you search for 'when was George Harrison born' does that prevent us from looking into our brain and realising the answer?" asks Wallaert, somewhat rhetorically. "When we scratch out that act, does it deprive us of that small burst of pleasure?"
The question is whether a greater and greater dependence on search means we are changing for the worse. Some search is good; all search could be detrimental.
Of course, there are counter-arguments. After all, when we search for facts on Bing or Google, we are gaining knowledge and, potentially, increasing our intelligence. Wallaert, for one, isn't concerned about the short-term implications, and no expert we asked suggests we should stop using these tools. What is disconcerting, though, is the idea that in some far-distant society we may not retain as much tacit knowledge, relying instead on what computers tell us to be true.