Self-hosted AI data workflow: DB and Ollama and SQL

Overview

Learn how to invoke open-source Large Language Models (LLMs) directly from your Exasol database using UDFs and Ollama. This tutorial demonstrates a fully self-hosted AI pipeline where your data never leaves your infrastructure.
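
To make the idea concrete, here is a minimal sketch of what such a UDF could look like, assuming an Ollama server is reachable from the Exasol cluster at a hypothetical host `ollama-host:11434` with the Mistral model already pulled. The script name `ask_llm`, the column sizes, and the `REVIEWS` table in the usage example are illustrative, not taken from the tutorial.

```sql
-- Sketch: a scalar Python3 UDF that sends a prompt to a local Ollama server
-- and returns the model's completion as a string.
-- (The trailing / delimiter is required by some SQL clients for multi-line scripts.)
CREATE OR REPLACE PYTHON3 SCALAR SCRIPT ask_llm(prompt VARCHAR(2000000))
RETURNS VARCHAR(2000000) AS
import json
import urllib.request

# Assumed endpoint: an Ollama instance reachable from the database nodes.
OLLAMA_URL = "http://ollama-host:11434/api/generate"

def run(ctx):
    # Build the Ollama /api/generate request; stream=False returns the
    # full completion in a single JSON response.
    payload = json.dumps({
        "model": "mistral",      # open-source model pulled via `ollama pull mistral`
        "prompt": ctx.prompt,
        "stream": False
    }).encode("utf-8")
    request = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request, timeout=300) as response:
        return json.loads(response.read().decode("utf-8"))["response"]
/

-- Illustrative usage: summarize a text column without the data ever
-- leaving your infrastructure (REVIEWS is a hypothetical table).
SELECT ask_llm('Summarize in one sentence: ' || REVIEW_TEXT) FROM REVIEWS;
```

The only requirement beyond a standard UDF is that the database nodes can open an HTTP connection to the machine running Ollama; no external API or credentials are involved.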

Why Self-Hosted?

Running open-source AI models in your own environment offers significant advantages:

**Cost Savings**

- No per-token API fees
- No usage-based pricing
- One-time model download, unlimited use

**Data Privacy & Security**

- Data never leaves your infrastructure
- No third-party API calls
- Full compliance control

**Open Source Freedom**

- Apache 2.0 licensed models (Mistral 7B)
- No vendor lock-in
- Modify and fine-tune as needed

**P…
