Google LLC’s cloud business will help Cohere Inc., an early-stage artificial intelligence startup, run natural language processing models in the cloud as part of a multiyear partnership announced today.
The partnership has multiple elements. First, Cohere will use Google’s Cloud TPU chips, specialized AI processors that the search giant offers via its public cloud, to develop and build “many” of its planned machine learning products. Additionally, Google Cloud Chief Executive Officer Thomas Kurian told TechCrunch that the companies intend to launch a joint go-to-market initiative.
“Leading companies around the world are using AI to fundamentally transform their business processes and deliver more helpful customer experiences,” Kurian said in a statement. “Our work with Cohere will make it easier and more cost-effective for any organization to realize the possibilities of AI with powerful NLP services powered by Google’s custom-designed Tensor Processing Units.”
Toronto-based Cohere is developing an AI platform that enables organizations to incorporate natural language processing features into their applications. The startup is led by co-founder and Chief Executive Officer Aidan Gomez, a former researcher at Google Brain, the search giant’s AI research group.
Gomez co-authored the landmark 2017 academic paper that introduced the concept of a Transformer model. A Transformer model is a type of neural network that has become the go-to choice for many important AI use cases, particularly in the field of natural language processing, Cohere’s focus area.
The main feature that sets Transformer models apart from other neural networks is their use of a so-called attention mechanism. The attention mechanism makes it possible to process text more efficiently than earlier neural network designs could.
To correctly interpret a word in a sentence, a natural language processing model must take into account the word’s context. For example, the word “bank” can mean either a financial institution or the bank of a river depending on the context. Transformer models’ attention mechanism allows them to identify which parts of a sentence influence the meaning of a word most directly. By prioritizing the most relevant context, Transformer models can produce highly accurate results.
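The idea can be illustrated with a minimal sketch of scaled dot-product attention, the mechanism introduced in the 2017 Transformer paper. The word embeddings below are invented for illustration; a real model would learn them from data:

```python
import numpy as np

def attention_weights(query, keys):
    """Turn dot-product relevance scores into a probability
    distribution over context words (illustrative sketch only)."""
    d = query.shape[-1]
    scores = keys @ query / np.sqrt(d)   # how relevant each context word is
    exp = np.exp(scores - scores.max())  # numerically stable softmax
    return exp / exp.sum()

# Hypothetical 2-dimensional embeddings: "bank" is the query word,
# and "deposit", "river", "the" are the surrounding context words.
bank = np.array([1.0, 0.0])
keys = np.array([[0.9, 0.1],   # "deposit" -- similar to "bank"
                 [0.1, 0.9],   # "river"
                 [0.0, 0.0]])  # "the"
w = attention_weights(bank, keys)
# "deposit" receives the largest weight, pulling the interpretation
# of "bank" toward the financial sense.
```

In a full Transformer, these weights are then used to blend the context words' value vectors into an updated representation of "bank."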
Cohere’s first product is a cloud-based natural language platform that can automatically generate text in response to prompts. The platform is accessible through an application programming interface. Though Cohere only exited stealth mode in May, the startup’s technology has already helped it raise a $40 million funding round led by prominent venture capital firm Index Ventures.
The Cloud TPU chips in Google Cloud that Cohere plans to use to build and run future products were originally developed by the search giant for internal use. Google designed the chips for the specific purpose of running AI models.
Each TPU includes thousands of highly specialized processing modules known as multiply-accumulators. The modules are optimized to perform matrix multiplications, the mathematical operations that neural networks use to extract insights from the data they process.
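The operation those modules accelerate can be sketched in a few lines. This naive Python version spells out the multiply-accumulate steps that TPU hardware performs in parallel; the function name and toy matrices are illustrative, not Google's implementation:

```python
import numpy as np

def matmul_with_macs(a, b):
    """Matrix multiply written as explicit multiply-accumulate steps,
    the core operation a TPU's hardware modules parallelize."""
    m, k = a.shape
    k2, n = b.shape
    assert k == k2, "inner dimensions must match"
    out = np.zeros((m, n))
    for i in range(m):
        for j in range(n):
            acc = 0.0
            for p in range(k):
                acc += a[i, p] * b[p, j]  # one multiply-accumulate
            out[i, j] = acc
    return out

x = np.array([[1.0, 2.0]])    # a tiny input vector
w = np.array([[3.0],          # a tiny weight matrix
              [4.0]])
y = matmul_with_macs(x, w)    # 1*3 + 2*4 = 11
```

A neural network layer is essentially this operation repeated at enormous scale, which is why packing thousands of multiply-accumulators onto one chip pays off.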
Google enables cloud customers to access TPUs in several ways. Organizations can rent individual TPUs or reserve capacity in a TPU Pod, a cluster of chips connected by a high-speed network. Google says that a single TPU Pod provides up to 100 petaflops of performance, which is equivalent to 100 quadrillion computations per second.