Core42 and Qualcomm Launch Compass 2.0 to Boost AI Efficiency and Accessibility

The platform integrates Qualcomm's inference-as-a-service, offering support for pre-trained models and scalable solutions within fully containerized environments.


[Image source: Krishna Prasad/MITSMR Middle East]

The upgraded platform debuted earlier this year and has already processed 150 billion tokens.

Powered by Qualcomm Cloud AI 100 Ultra inference accelerators, Compass 2.0 aims to set new benchmarks in energy efficiency, flexibility, and cost-effectiveness for enterprises running AI workloads.


Designed to simplify the use of AI models, Compass 2.0 expands its API capabilities to include a diverse range of optimized GenAI, embeddings, computer vision, and natural language processing models.

The platform's additions include JAIS, the leading Arabic large language model, alongside models from the Azure OpenAI GPT-4 family, broadening the scope for advanced AI applications.

“Our partnership drives transformative advancements in AI accessibility and performance. Compass 2.0 facilitates rapid deployment of AI innovations, ensuring seamless integration on an optimized inferencing architecture,” said Kiril Evtimov, group CTO of G42 and CEO of Core42.

Rashid Attar, VP of Cloud Computing at Qualcomm Technologies, said Compass 2.0 heralds a “new era in Generative AI capabilities.”

“Through our continued collaboration with G42, we aim to empower organizations globally, enhancing data analytics precision and language comprehension across diverse sectors.”
