this post was submitted on 23 May 2025
cybersecurity
This project implements a FastAPI-based local server designed to load one or more pre-trained NLP models during startup and expose them through a clean, RESTful API for inference.

For example, it leverages the Hugging Face transformers library to load the CIRCL/vulnerability-severity-classification-distilbert-base-uncased model, which specializes in classifying vulnerability descriptions according to their severity level. The server initializes this model once at startup, so individual inference requests do not pay the model-loading cost.
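
A minimal sketch of that pattern, assuming a Hugging Face text-classification pipeline loaded in a FastAPI lifespan hook; the endpoint path and request schema below are illustrative assumptions, not the project's actual API:

```python
# Sketch: load the severity-classification model once at startup and expose
# it through a single POST endpoint. Endpoint name and schema are assumptions.
from contextlib import asynccontextmanager

from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

models = {}

@asynccontextmanager
async def lifespan(app: FastAPI):
    # Load the model a single time when the server starts.
    models["severity"] = pipeline(
        "text-classification",
        model="CIRCL/vulnerability-severity-classification-distilbert-base-uncased",
    )
    yield
    models.clear()

app = FastAPI(title="Local NLP inference server", lifespan=lifespan)

class Description(BaseModel):
    text: str

@app.post("/severity")  # hypothetical endpoint name
def classify_severity(payload: Description):
    # Reuse the already-loaded pipeline; no per-request model loading.
    return models["severity"](payload.text)
```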

Clients interact with the server via dedicated HTTP endpoints corresponding to each loaded model. Additionally, the server automatically generates comprehensive OpenAPI documentation that details the available endpoints, their expected input formats, and sample responses—making it easy to explore and integrate the services.
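
For instance, a client could call the hypothetical /severity endpoint from the sketch above with a plain HTTP POST; FastAPI serves the interactive OpenAPI documentation at /docs by default:

```python
# Hypothetical client call against the sketch above, assumed to be running
# locally on port 8000. The endpoint name and payload shape are illustrative.
import requests

resp = requests.post(
    "http://127.0.0.1:8000/severity",
    json={"text": "Heap-based buffer overflow allows remote code execution."},
)
resp.raise_for_status()
print(resp.json())  # predicted severity label(s) with confidence scores
```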

The ultimate goal is to enrich vulnerability data descriptions through the application of a suite of NLP models, providing direct benefits to Vulnerability-Lookup and supporting other related projects.

Conceptual architecture

1 comment
muntedcrocodile@lemm.ee 0 points 16 hours ago

Seems fun, like I could just plug it into my openwebui docker compose. Mind you, is the only benefit of this that you can run non-Ollama-compatible models?