No UI is the New UI thanks to Chatbots and Brain-Computer Interfaces, by Tony Aube, Medium
Deep learning is a complete game changer. It allowed AI to reach new heights previously thought to be decades away. Nowadays, computers can hear, see, read and understand humans better than ever before. This is opening a world of opportunities for AI-powered apps, toward which entrepreneurs are rushing.
In this gold rush, messaging is the low-hanging fruit. Out of all the possible forms of input, digital text is the most direct one. Text is consistent: it doesn’t carry the ambiguous information that other forms of communication do, such as voice or gestures. Furthermore, messaging makes for a better user experience than traditional apps because it feels natural and familiar. When messaging becomes the UI, you don’t need to deal with a constant stream of new interfaces, each filled with different menus, buttons and labels. This explains the current rise in popularity of invisible and conversational apps, but the reason you should care about them goes beyond that.
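To make the idea concrete, here is a minimal sketch of what "messaging becomes the UI" means in code. All names and the regex-based intent matching are hypothetical, standing in for the deep-learning models the article describes; the point is that a single free-form text box can replace an app full of menus and buttons.

```python
import re

# Toy intent table (hypothetical example): each pattern stands in for
# a learned intent classifier in a real conversational app.
INTENTS = [
    (re.compile(r"\bbalance\b", re.I),
     lambda m: "Your balance is $1,250.00."),
    (re.compile(r"\bsend \$?(\d+) to (\w+)", re.I),
     lambda m: f"Sent ${m.group(1)} to {m.group(2)}."),
]

def reply(message: str) -> str:
    """Route a free-form text message to an intent handler.

    One text field handles every command; no menus or buttons needed.
    """
    for pattern, handler in INTENTS:
        match = pattern.search(message)
        if match:
            return handler(match)
    return "Sorry, I didn't understand that."
```

A real system would swap the regexes for a trained language model, but the interface contract is the same: text in, text out.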
And it’s not just chatbots; brain interfaces are part of this trend too:
The first video showcases Project Soli, a small radar chip created by Google to allow fine gesture recognition. The second one presents Emotiv, a product that can read your brainwaves and understand their meaning through — bear with me — electroencephalography (or EEG for short). While both technologies seem completely magical, they are not. They are functional today, and they have something very special in common: they don’t require a UI for computer input.
As a designer, this is an unsettling trend to internalize. In a world where computers can see, listen, talk, understand and reply to you, what is the purpose of a user interface? Why bother designing an app to manage your bank account when you could just talk to it directly? Beyond human-computer interaction, we are entering the world of brain-computer interaction. In this world, digital telepathy coupled with AI and other means of input could allow us to communicate directly with computers, without the need for a screen.