typedef
@typedef.ai
typedef.ai
We are here to eat bamba and revolutionize the world of query engines. The Spark is gone, let's rethink data processing with a pinch of AI
fenic's Local Data Caching & Persistence keeps expensive AI steps from rerunning and keeps your pipelines resilient.
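A minimal sketch of the idea, assuming a Spark-style `.cache()` on DataFrames and a `write.save_as_table(...)` method; the config classes and calls here are illustrative assumptions based on the post, not confirmed fenic API.

```python
import fenic as fc

# Assumed config API: one small default model for the semantic step below.
session = fc.Session.get_or_create(fc.SessionConfig(
    app_name="caching_demo",
    semantic=fc.SemanticConfig(
        language_models={"mini": fc.OpenAILanguageModel(model_name="gpt-4o-mini", rpm=100, tpm=50_000)},
        default_language_model="mini",
    ),
))

summaries = (
    session.read.csv("tickets.csv")
    .with_column("summary", fc.semantic.map("Summarize this ticket: {{ body }}", body=fc.col("body")))
    .cache()  # assumed: persist results locally so reruns skip the expensive LLM calls
)

# Assumed: materialize to a local table so later sessions can reuse it.
summaries.write.save_as_table("ticket_summaries", mode="overwrite")
```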
September 24, 2025 at 1:38 AM
fenic's Multiple Model Configuration & Selection lets you pick the right model for each step, cheap where you can, powerful where you must.

Think of it as a per-operator model dial across your pipeline.
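A sketch of that dial, assuming models are registered under aliases in the session config and that semantic operators accept a `model_alias` argument to override the default per call; both are assumptions based on the post.

```python
import fenic as fc

# Assumed config API: a cheap default model plus a stronger one under another alias.
session = fc.Session.get_or_create(fc.SessionConfig(
    app_name="model_dial_demo",
    semantic=fc.SemanticConfig(
        language_models={
            "cheap": fc.OpenAILanguageModel(model_name="gpt-4o-mini", rpm=200, tpm=100_000),
            "strong": fc.OpenAILanguageModel(model_name="gpt-4o", rpm=60, tpm=60_000),
        },
        default_language_model="cheap",
    ),
))

emails = session.read.csv("emails.csv")
emails = (
    emails
    # Cheap default model where a rough label is good enough.
    .with_column("topic", fc.semantic.classify(fc.col("body"), ["billing", "bug", "feature"]))
    # Stronger model where quality matters; `model_alias` is an assumed parameter name.
    .with_column(
        "reply_draft",
        fc.semantic.map("Draft a short reply to: {{ body }}", body=fc.col("body"), model_alias="strong"),
    )
)
```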
September 22, 2025 at 11:07 PM
fenic's Structured Output Extraction turns LLM text into validated tables, directly in your DataFrame.

Think of it as schema-first parsing: you define a Pydantic model; Fenic enforces it and returns structured columns.
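A sketch of the schema-first flow, assuming an `fc.semantic.extract(column, PydanticModel)` call and an `unnest` to flatten the result; those names are assumptions based on the post, not verified signatures.

```python
import fenic as fc
from pydantic import BaseModel, Field

class Ticket(BaseModel):
    product: str = Field(description="Product the customer is writing about")
    severity: str = Field(description="low, medium, or high")
    summary: str = Field(description="One-sentence summary of the issue")

# Assumed config API: a default model for extraction.
session = fc.Session.get_or_create(fc.SessionConfig(
    app_name="extract_demo",
    semantic=fc.SemanticConfig(
        language_models={"mini": fc.OpenAILanguageModel(model_name="gpt-4o-mini", rpm=100, tpm=50_000)},
        default_language_model="mini",
    ),
))

df = session.create_dataframe([{"body": "The export button crashes the app every time on v2.3"}])

# Assumed API: extract returns a struct column validated against the Pydantic model,
# and unnest flattens it into ordinary typed columns.
df.select(fc.semantic.extract(fc.col("body"), Ticket).alias("ticket")).unnest("ticket").show()
```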
September 20, 2025 at 1:38 AM
fenic's First-Class AI Data Types make embeddings, markdown, and JSON real, typed columns, with the right operations built in.

Think of it as strong types for meaning and structure: safer pipelines, richer queries.
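A sketch of what typed columns might look like, assuming casts to `fc.MarkdownType` / `fc.JsonType`, an `fc.semantic.embed(...)` helper, and structure-aware functions like `fc.markdown.extract_header_chunks` and `fc.json.jq`; every one of those names is an assumption drawn from the post.

```python
import fenic as fc

# Assumed config API: an embedding model alongside the language model.
session = fc.Session.get_or_create(fc.SessionConfig(
    app_name="types_demo",
    semantic=fc.SemanticConfig(
        language_models={"mini": fc.OpenAILanguageModel(model_name="gpt-4o-mini", rpm=100, tpm=50_000)},
        default_language_model="mini",
        embedding_models={"embed": fc.OpenAIEmbeddingModel(model_name="text-embedding-3-small", rpm=100, tpm=100_000)},
        default_embedding_model="embed",
    ),
))

docs = session.create_dataframe([{
    "readme": "# fenic\n## Install\npip install fenic",
    "meta": '{"stars": 1200, "language": "Python"}',
}])

typed = docs.select(
    fc.col("readme").cast(fc.MarkdownType).alias("readme_md"),  # markdown-aware column
    fc.col("meta").cast(fc.JsonType).alias("meta_json"),        # JSON-aware column
    fc.semantic.embed(fc.col("readme")).alias("readme_vec"),    # embedding column (assumed helper)
)

# Assumed structure-aware helpers on the typed columns.
typed.select(
    fc.markdown.extract_header_chunks(fc.col("readme_md"), header_level=2).alias("sections"),
    fc.json.jq(fc.col("meta_json"), ".stars").alias("stars"),
).show()
```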
September 18, 2025 at 1:38 AM
fenic's Semantic Classification turns free-text into clean enums right inside your DataFrame.
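A sketch, assuming a `fc.semantic.classify(column, labels)` call that maps free text onto a fixed label set; the call shape is an assumption based on the post.

```python
import fenic as fc

# Assumed config API: one small default model.
session = fc.Session.get_or_create(fc.SessionConfig(
    app_name="classify_demo",
    semantic=fc.SemanticConfig(
        language_models={"mini": fc.OpenAILanguageModel(model_name="gpt-4o-mini", rpm=100, tpm=50_000)},
        default_language_model="mini",
    ),
))

tickets = session.create_dataframe([
    {"body": "I was double charged for September"},
    {"body": "App crashes when I open the settings page"},
])

# Assumed API: free text in, one of the given labels out, as a plain column.
tickets.with_column(
    "category",
    fc.semantic.classify(fc.col("body"), ["billing", "bug", "account", "other"]),
).show()
```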
September 16, 2025 at 1:38 AM
fenic's Semantic Similarity Join (Vector Join) finds nearest neighbors across tables using embeddings, right inside your DataFrame.
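A sketch of a vector join, assuming `fc.semantic.embed(...)` for embeddings and a `df.semantic.sim_join(other, left_on=..., right_on=..., k=...)` method; both the method name and its parameters are assumptions.

```python
import fenic as fc

# Assumed config API: an embedding model registered for the session.
session = fc.Session.get_or_create(fc.SessionConfig(
    app_name="sim_join_demo",
    semantic=fc.SemanticConfig(
        language_models={"mini": fc.OpenAILanguageModel(model_name="gpt-4o-mini", rpm=100, tpm=50_000)},
        default_language_model="mini",
        embedding_models={"embed": fc.OpenAIEmbeddingModel(model_name="text-embedding-3-small", rpm=100, tpm=100_000)},
        default_embedding_model="embed",
    ),
))

questions = session.create_dataframe([{"q": "How do I rotate my API key?"}])
docs = session.create_dataframe([
    {"title": "Rotating credentials", "text": "Go to Settings > Keys and click Rotate."},
    {"title": "Billing FAQ", "text": "Invoices are issued monthly."},
])

# Embed both sides, then take the nearest doc for each question.
q = questions.with_column("q_vec", fc.semantic.embed(fc.col("q")))
d = docs.with_column("d_vec", fc.semantic.embed(fc.col("text")))

# Assumed API: nearest-neighbor join on embedding columns.
q.semantic.sim_join(d, left_on=fc.col("q_vec"), right_on=fc.col("d_vec"), k=1).select("q", "title").show()
```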
September 12, 2025 at 2:26 AM
fenic 0.4.0 is live: declarative tools for agents, a production-ready MCP server, and direct reads from HuggingFace plus big DX & reliability gains. 

Highlights:

Declarative tools: define function-calling tools as data (type-safe, reviewable, reusable).
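Of these, the HuggingFace read is the easiest to sketch; the `hf://` path scheme on the existing readers matches the release note, but the dataset path and file layout below are hypothetical.

```python
import fenic as fc

session = fc.Session.get_or_create(fc.SessionConfig(app_name="hf_read_demo"))

# Assumed path scheme for direct HuggingFace reads; the dataset path is made up.
reviews = session.read.parquet("hf://datasets/some-org/some-dataset/train.parquet")
reviews.select("text").show()
```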
September 9, 2025 at 9:20 PM
fenic ensures LLM outputs conform to schemas using Pydantic models.
September 8, 2025 at 5:45 PM
fenic offers standard DataFrame operations with a familiar Spark/Pandas-like API.
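A sketch of that relational surface with no LLM calls at all; the aggregate helpers and `order_by` naming are assumed to follow the Spark-like convention the post describes.

```python
import fenic as fc

session = fc.Session.get_or_create(fc.SessionConfig(app_name="df_ops_demo"))

orders = session.create_dataframe([
    {"region": "EU", "amount": 120.0},
    {"region": "EU", "amount": 80.0},
    {"region": "US", "amount": 200.0},
])

# Familiar filter / group_by / agg / order_by chain; helper names assumed.
(
    orders
    .filter(fc.col("amount") > 50)
    .group_by("region")
    .agg(fc.sum(fc.col("amount")).alias("total"), fc.count(fc.col("amount")).alias("orders"))
    .order_by(fc.col("total").desc())
    .show()
)
```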
September 6, 2025 at 2:44 AM
fenic UDFs allow you to inject arbitrary Python (including external libraries) directly into the fenic execution plan while preserving lazy planning, metrics and reproducibility.
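A sketch assuming a `@fc.udf(return_type=...)` decorator and an `fc.IntegerType`; the decorator name and type object are assumptions based on the post's description.

```python
import re
import fenic as fc

# Assumed decorator API: register plain Python (any library you like) as a typed expression
# that still participates in fenic's lazy plan and metrics.
@fc.udf(return_type=fc.IntegerType)
def count_urls(text: str) -> int:
    """Count http(s) links in a block of text."""
    return len(re.findall(r"https?://\S+", text or ""))

session = fc.Session.get_or_create(fc.SessionConfig(app_name="udf_demo"))
df = session.create_dataframe([{"body": "See https://typedef.ai and https://docs.typedef.ai"}])

df.with_column("n_urls", count_urls(fc.col("body"))).show()
```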
September 5, 2025 at 2:44 AM
fenic offers Flexible Session Configuration. Define AI providers/models (OpenAI GPT‑4, Claude, etc.) and RPM/TPM rate limits at session start so the framework centrally manages throttling, costs and consistency across all AI calls.
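A sketch of that centralized setup; `SessionConfig`, `SemanticConfig`, `OpenAILanguageModel`, and the `rpm`/`tpm` fields are assumed names matching the post's description, and Claude models would register the same way.

```python
import fenic as fc

# Assumed config classes/fields: providers, model aliases, and RPM/TPM limits declared once,
# so every AI call in the session is throttled and costed against the same budget.
config = fc.SessionConfig(
    app_name="support_pipeline",
    semantic=fc.SemanticConfig(
        language_models={
            "fast": fc.OpenAILanguageModel(model_name="gpt-4o-mini", rpm=300, tpm=200_000),
            "careful": fc.OpenAILanguageModel(model_name="gpt-4o", rpm=60, tpm=60_000),
            # An Anthropic/Claude entry would plug in here the same way.
        },
        default_language_model="fast",
    ),
)
session = fc.Session.get_or_create(config)
```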
September 3, 2025 at 3:57 PM
Example: group support tickets by account_id, sort by time, produce one weekly summary per account. Think of it as the reduce() for unstructured text.
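A sketch of that ticket rollup, assuming `fc.semantic.reduce(instruction, column, ...)` works as an aggregate inside `group_by().agg(...)` and takes an `order_by` argument for chronological packing; those argument names are assumptions.

```python
import fenic as fc

# Assumed config API: one default model for the aggregation.
session = fc.Session.get_or_create(fc.SessionConfig(
    app_name="reduce_demo",
    semantic=fc.SemanticConfig(
        language_models={"mini": fc.OpenAILanguageModel(model_name="gpt-4o-mini", rpm=100, tpm=50_000)},
        default_language_model="mini",
    ),
))

tickets = session.read.csv("tickets.csv")  # columns assumed: account_id, created_at, body

# Many rows in, one weekly summary out per account.
weekly = (
    tickets
    .group_by("account_id")
    .agg(
        fc.semantic.reduce(
            "Write a weekly summary of this account's support tickets.",
            fc.col("body"),
            order_by=fc.col("created_at"),  # assumed argument for chronological ordering
        ).alias("weekly_summary")
    )
)
weekly.show()
```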
August 29, 2025 at 2:44 AM
Semantic Reduce = apply LLM aggregation at group-by scale. Many rows in → one summary per group. You write a prompt and fenic handles packing, ordering, and multi-pass reduction.
August 29, 2025 at 2:44 AM
This example turns the name and details columns into a one-line job blurb. Think of it as the map() for LLM reasoning inside your DataFrame.

Today, teams hand-roll loops and JSON parsing, fight rate limits, and wire vector lookups. Glue grows. Throughput drops and pipelines become increasingly brittle.
August 25, 2025 at 11:07 PM
Semantic Map is how you apply inference at scale with fenic. Whether it’s 1 call or 1M, it’s ~5 lines.

You iterate on data and prompts; Fenic handles the rest. Write a Jinja template with column placeholders; Fenic renders it per row and calls the model.
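A sketch of the job-blurb example described above, assuming `fc.semantic.map` takes a Jinja template plus the columns it references as keyword arguments; that calling convention is an assumption based on this post.

```python
import fenic as fc

# Assumed config API: one default model for the map.
session = fc.Session.get_or_create(fc.SessionConfig(
    app_name="map_demo",
    semantic=fc.SemanticConfig(
        language_models={"mini": fc.OpenAILanguageModel(model_name="gpt-4o-mini", rpm=100, tpm=50_000)},
        default_language_model="mini",
    ),
))

jobs = session.create_dataframe([
    {"name": "Staff Data Engineer", "details": "Own the batch inference platform; Python, Spark, LLM pipelines."},
    {"name": "Solutions Architect", "details": "Help customers design semantic data pipelines."},
])

# Jinja-style template with column placeholders, rendered per row and sent to the model.
jobs.with_column(
    "blurb",
    fc.semantic.map(
        "Write a one-line job blurb for {{ name }}. Details: {{ details }}",
        name=fc.col("name"),
        details=fc.col("details"),
    ),
).show()
```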
August 25, 2025 at 11:07 PM
AI-native data types in fenic. In this demo: the Markdown type.
August 21, 2025 at 5:02 PM
Fenic adds AI-native scalar functions to DataFrames. They run LLM inference over columns/rows, so reasoning lives inside your pipeline.
August 20, 2025 at 6:45 PM
Semantic join is one of fenic's AI-native DataFrame functions, which operate over whole tables and relationships, not just individual rows.
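A sketch of a table-level semantic join, assuming a `left.semantic.join(right, ...)` method whose predicate is a natural-language condition over columns from both sides; the method name and placeholder convention are assumptions.

```python
import fenic as fc

# Assumed config API: one default model to evaluate the join condition.
session = fc.Session.get_or_create(fc.SessionConfig(
    app_name="semantic_join_demo",
    semantic=fc.SemanticConfig(
        language_models={"mini": fc.OpenAILanguageModel(model_name="gpt-4o-mini", rpm=100, tpm=50_000)},
        default_language_model="mini",
    ),
))

candidates = session.create_dataframe([
    {"resume": "5 years of Python and distributed data pipelines (Spark, Ray)."},
    {"resume": "Front-end engineer focused on React and design systems."},
])
jobs = session.create_dataframe([
    {"posting": "Data platform engineer: Python, Spark, large-scale ETL."},
])

# Assumed API: keep candidate/job pairs where the model judges the condition true.
candidates.semantic.join(
    jobs,
    "The candidate described by {{ left.resume }} is a good fit for the role {{ right.posting }}",
).show()
```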
August 20, 2025 at 1:46 AM
What an amazing night hosting some of the Peninsula’s brightest minds in data. Delicious tacos, amazing tech talks, and the 2025 view of data by Tomasz from TheoryVC!
January 28, 2025 at 6:52 PM