AI4EOSC Platform launches beta LLM4EOSC API service: Open call for preview access

The AI4EOSC platform is excited to announce the deployment of our beta LLM4EOSC (Large Language Models for the EOSC) API service. This innovative service is now available for a limited number of users in the European Open Science Cloud (EOSC) community to evaluate and provide feedback on.

The beta LLM4EOSC API service offers powerful capabilities for natural language processing and understanding, enabling researchers and scientists to enhance their workflows and projects. By leveraging advanced AI technologies, users can perform tasks such as text generation, summarization, question answering, and more, all within the context of the EOSC.

Open call for preview access

To facilitate a thorough evaluation of the LLM4EOSC API service within the EOSC environment, we are launching an open call for preview access. Interested users from the EOSC community are invited to request access on a limited basis. This opportunity will allow users to explore the potential of LLM APIs in their specific scientific domains and provide valuable feedback that will help shape the future development of the service.

Key features of the beta LLM4EOSC API service

  • Advanced natural language processing: Utilize state-of-the-art LLM capabilities for text understanding and generation.
  • User-friendly interface: Easy-to-use API interface for quick integration into existing workflows.
  • Fine-grained access via API tokens: Secure and controlled access to the service through API tokens, allowing users to manage and monitor usage effectively (a minimal usage sketch follows this list).
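
As a minimal sketch of what token-based access could look like, the snippet below authenticates against an OpenAI-compatible endpoint with a personal API token and lists the models the token can reach. The base URL, environment variable, and client package are illustrative assumptions, not the actual service values.

```python
# Minimal sketch: token-based access to an OpenAI-compatible endpoint.
# The base URL and environment variable below are placeholders; use the
# values supplied with your preview access.
import os

from openai import OpenAI  # pip install openai

client = OpenAI(
    base_url="https://llm.example.ai4eosc.eu/v1",  # hypothetical endpoint
    api_key=os.environ["LLM4EOSC_API_TOKEN"],      # your personal API token
)

# List the models this token is allowed to use.
for model in client.models.list():
    print(model.id)
```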

What can you do with the LLM4EOSC API service

The current service offering provides an OpenAI-compatible API giving access to state-of-the-art open LLMs for fast, lightweight tasks. Secured via personal API tokens, the service can be used to build native AI applications such as chatbots, assistants, or knowledge retrieval over a given set of documents or documentation (see the sketch below).
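
For illustration, a chatbot-style request against such an OpenAI-compatible API could look like the sketch below; the endpoint, model name, and token variable are placeholders rather than the service's actual values.

```python
# Minimal sketch of a chat-style request through an OpenAI-compatible API.
# Endpoint, model name, and token variable are placeholders only.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://llm.example.ai4eosc.eu/v1",  # hypothetical endpoint
    api_key=os.environ["LLM4EOSC_API_TOKEN"],
)

response = client.chat.completions.create(
    model="open-llm-placeholder",  # choose a model offered by the service
    messages=[
        {"role": "system", "content": "You are an assistant for EOSC researchers."},
        {"role": "user", "content": "Summarise what the LLM4EOSC API service offers."},
    ],
    max_tokens=200,
)
print(response.choices[0].message.content)
```

The same pattern extends to knowledge-retrieval applications: split your documents into passages, select the relevant ones with your preferred search or embedding tooling, and pass them as context in the messages list.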

Although this service is not yet in production, we are opening it as an early preview to gather preliminary feedback on such a service within the EOSC.

What’s next

Looking ahead, the AI4EOSC platform is committed to continually enhancing the LLM API service to meet the evolving needs of the scientific community. Our roadmap includes the following developments:

  • Inclusion of new models: Expand the service to include code-based and multimodal models, offering even more advanced capabilities for a wide range of applications.
  • API key generation within the platform: Streamline the user experience by providing an integrated API key generation feature directly within the AI4EOSC platform.
  • Deployment of an open chat interface with EOSC and AAI integration: Introduce an open chat interface integrated with the European Open Science Cloud (EOSC) and the Authentication and Authorization Infrastructure (AAI), allowing users to interact with the LLM API in a more intuitive, user-friendly way while ensuring secure and seamless authentication.

How to request access

Researchers and scientists interested in previewing the beta LLM API service are encouraged to submit their requests via the following online form. The open call will remain active until 15 November, or until the limited number of preview slots has been filled.

For more information about the beta LLM API service and the open call for preview access, please contact us via ai4eosc-po@listas.csic.es.
