By 2030, information and communication technologies are projected to represent 21% of global energy needs. The environmental cost of the models being built is a major challenge for applying machine learning to natural language. Mainstream approaches focus on very large models that manipulate language by means of statistics and require huge amounts of training data and computing power, yet achieve only a superficial understanding of language.
In his talk at AI Global Forum 2020, Francisco Webber, co-founder and CEO of Cortical.io, argues that the future of natural language understanding lies in an efficient approach to AI that leverages what biology teaches us. By replicating the brain's actual cognitive processes, Semantic Folding offers a highly efficient NLU model that captures semantics at a fine-grained level.
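The core idea behind Semantic Folding is to represent words as sparse binary vectors ("semantic fingerprints") and to measure similarity by the overlap of active bits. The sketch below is purely illustrative, with made-up toy fingerprints rather than real Cortical.io data, but it shows why comparing sparse representations is so cheap compared to running a large statistical model:

```python
# Illustrative sketch only: Semantic Folding-style representations encode
# words as sparse binary "semantic fingerprints"; semantic similarity is
# then just the overlap of active bits. The fingerprints below are toy
# assumptions, not actual Cortical.io output.

def overlap(fp_a: set, fp_b: set) -> float:
    """Fraction of shared active bits (Jaccard-style similarity)."""
    return len(fp_a & fp_b) / len(fp_a | fp_b)

# Toy fingerprints: each set holds the indices of active bits on a
# hypothetical 2D semantic map (only a few percent of bits are active).
dog = {3, 17, 42, 99, 256, 300, 511}
cat = {3, 17, 42, 88, 256, 321, 600}
car = {5, 200, 777, 901, 1024, 1500, 2047}

print(overlap(dog, cat))  # related concepts share many bits -> 0.4
print(overlap(dog, car))  # unrelated concepts share none    -> 0.0
```

Because similarity reduces to set intersection on a handful of active bits, the comparison needs no matrix multiplications or GPU inference, which is the efficiency argument the talk develops.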
Key takeaways:
- Why High Efficiency AI is the path for the future
- Which AI approach delivers orders-of-magnitude improvements in efficiency
- Why one should teach semantics rather than train data-intensive statistical models