
In the mid-20th century, the world entered the Information Age, a period of sweeping transformation from traditional industry to information technology.
This era began with the miniaturization of computers and culminated in the invention of the World Wide Web, making information accessible to almost everyone.
Now, with the rise of artificial intelligence (AI), some technology leaders believe this era has ended, and a completely new technological era is dawning.
“We have moved from the Information Age to the Intelligent Age,” said Prakhar Mehrotra, Senior Vice President of AI at PayPal, at an AI summit earlier this month.
Mehrotra explained that the hallmark of this “Intelligent Age” is that industries are moving away from traditional data storage and retrieval models. With AI, data is generated more proactively, with the ultimate goal of achieving partial autonomy in the workplace.
Companies worldwide are now racing to apply AI to their workflows in hopes of improving productivity and efficiency, so far with mixed results. A study released by MIT in August found that 95% of enterprise AI workplace initiatives had failed to deliver rapid revenue growth.
“This will be a long journey … it has to go through the stages of ‘crawling, walking, running,’” Mehrotra said. “That statement was true ten years ago, and it still applies today.”
Marc Hamilton, Vice President of Solutions Architecture and Engineering at NVIDIA, interviewed alongside Mehrotra, said that the key to building enterprise AI systems in the future lies in investing in “AI factories,” whether deployed on-premises or run in the cloud. The reason, he argued, is that the data supporting enterprise operations will increasingly be generated by artificial intelligence rather than retrieved by humans or conventional computer systems.
“When you ask, ‘Generate PowerPoint slides containing specific content,’ or ask, ‘I’m writing this function, can you generate the corresponding code?’—this isn’t retrieving information from a database, but rather calling a model to dynamically generate data,” Hamilton explained.
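To make that distinction concrete, the sketch below contrasts a classic database lookup with a call to a hosted model that generates content on demand. The SQLite table, the prompt, the model name, and the use of the OpenAI Python SDK are illustrative assumptions standing in for whatever systems an enterprise actually runs, not details from Hamilton's remarks.

```python
# Illustrative contrast between retrieval and generation. The table name,
# prompt, model name, and the use of SQLite plus the OpenAI Python SDK are
# assumptions for this sketch, not details from the article.
import sqlite3

from openai import OpenAI


def retrieve_from_database(ticket_id: int) -> str:
    """Classic retrieval: the answer must already exist as a stored record."""
    conn = sqlite3.connect("support.db")
    row = conn.execute(
        "SELECT body FROM tickets WHERE id = ?", (ticket_id,)
    ).fetchone()
    conn.close()
    return row[0] if row else ""


def generate_with_model(task: str) -> str:
    """Generative path: the content is produced by the model at request time."""
    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; any chat model would do
        messages=[{"role": "user", "content": task}],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    print(generate_with_model(
        "I'm writing a function to parse ISO 8601 dates in Python; "
        "can you generate the corresponding code?"
    ))
```

The point of the contrast is that the second function's output does not exist anywhere until the request arrives; the model creates it token by token.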
Tokens Become Key
Mehrotra pointed out that to build the computing capacity needed to generate this kind of data effectively, enterprises must focus on a new basic unit: the token. Tokens are the units in which text-based AI models read and process language; they are both the fragments of information in training data and the content a model generates in response to a prompt.
“Every company must re-examine their data from a token perspective, because only then can intelligence be extracted,” Mehrotra emphasized.
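For readers unfamiliar with the unit, the short sketch below shows how a sentence maps onto tokens. The choice of the tiktoken library and the cl100k_base encoding is an assumption for illustration; in practice the tokenizer depends on the model a company actually uses.

```python
# A minimal look at text "from a token perspective." The tiktoken library and
# the cl100k_base encoding are illustrative assumptions; the right tokenizer
# depends on the model in use.
import tiktoken

encoding = tiktoken.get_encoding("cl100k_base")

text = "We have moved from the Information Age to the Intelligent Age."
token_ids = encoding.encode(text)

print(len(token_ids), "tokens")                   # how much model input/output the sentence represents
print([encoding.decode([t]) for t in token_ids])  # each token decoded back to its text fragment
```

Counting input and output in this way is what turns tokens into the performance metric discussed below.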
As a measure of both input and output, token generation has become a key performance indicator for tech companies. In May, NVIDIA revealed that its chip customer Microsoft generated over 100 trillion tokens in the first quarter, a fivefold year-over-year increase. Output figures like these help AI companies demonstrate progress to investors and support their valuations, although in practice token volume correlates less strongly with market demand and profit than tech companies suggest.
Both Mehrotra and Hamilton believe that many companies now recognize how important tokens are to building AI capabilities but are still working out how best to apply them to their own needs. In their view, every company already possesses some form of “AI factory,” able both to receive and to produce tokens of commercial value.
“I see it as a kind of ‘muscle training,’” Mehrotra said. “If all employees can think and work from the perspective of tokens and the generation process, then this will become a completely different company.” (Source: CLS)