• Transformers underpin the viral chatbot ChatGPT and the models driving the current generative AI race.
  • Whether AI can keep improving rapidly simply by adding more computing power and training data is hotly debated.

Artificial intelligence startup Symbolica has secured £24.5 million in Series A funding. CEO George Morgan cites a paper co-authored with DeepMind amid industry debate over how AI performance gains can continue. Symbolica’s first product, a coding assistant, is due in early 2025.

Access to financing

Symbolica said the round also drew participation from General Catalyst, Abstract Ventures, and Buckley Ventures.

Symbolica CEO George Morgan, who previously worked in Tesla’s Autopilot division, cited a paper co-authored with Google’s AI subsidiary DeepMind arguing that transformers are not the essence of AI, and criticised the current generative AI race as hackers hacking on hackers.


Improved performance of AI

Whether AI can sustain its performance gains through more compute power and training data remains contentious. Companies like OpenAI are focused on securing more compute, while others like Symbolica believe different foundation model architectures can yield better results than scaling transformers. The industry is grappling with the cost and reliability of transformer-based models.

Symbolica’s first product will be a coding assistant, but Morgan said it will not be available until early 2025, as the company still needs to hire staff and train a model.