High-performance CSV to database ingestion tool that processes millions of rows in seconds with zero-allocation parsing and intelligent schema inference.
Built for speed: parallel, batched ingestion keeps throughput high from parse to commit.
Everything you need for efficient CSV data ingestion into your database.
Leverages the SEP library for zero-allocation CSV parsing, achieving exceptional throughput of 100,000+ rows per second.
Works with PostgreSQL, SQL Server, MySQL, and SQLite, adapting to each provider's specific capabilities and bulk-load optimizations.
Smart automation reduces manual configuration while maintaining full control when needed.
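The zero-allocation parsing claim comes from the underlying Sep library (nietras.SeparatedValues), which exposes each CSV row as spans over its read buffer rather than allocating strings per field. As a standalone sketch of that parsing style (this is Sep's public API, not Slurp's internal code; the file and column names are illustrative):

```csharp
using nietras.SeparatedValues;

// Open a CSV with Sep; rows are views over the reader's buffer.
using var reader = Sep.Reader().FromFile("data.csv");
foreach (var row in reader)
{
    // Parse directly from the underlying span; strings are only
    // materialized when you explicitly ask for them.
    var name = row["name"].ToString();
    var price = row["price"].Parse<decimal>();
}
```

Parsing values straight from spans is what lets a tool like this sustain six-figure row rates without GC pressure.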
Get started with Noundry.Slurp in seconds using your preferred package manager.
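Assuming the tool is published to NuGet under the `Noundry.Slurp` package id (the exact id and tool availability are assumptions here, not confirmed by this page), installation would look like:

```shell
# Install the CLI as a .NET global tool (package id assumed)
dotnet tool install --global Noundry.Slurp

# Or reference the library from a project
dotnet add package Noundry.Slurp
```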
Multiple ways to use Slurp for your data ingestion needs.
# Launch interactive CLI
$ slurp data.csv
? Select database provider:
❯ PostgreSQL
SQL Server
MySQL
SQLite
? Enter server: localhost
? Enter database: mydb
? Enter username: user
? Enter password: ******
✓ Ingested 1,234,567 rows in 12.34 seconds
$ slurp sales_data.csv \
--provider postgres \
--server localhost \
--database analytics \
--username admin \
--password secure123 \
--table sales_2024
$ slurp inventory.csv \
-p sqlserver \
-s "tcp:myserver.database.windows.net,1433" \
-d InventoryDB \
-u sa \
--password "MyStr0ngP@ssw0rd"
$ slurp customers.csv \
--provider sqlite \
--database "./data/customers.db" \
--create-if-missing
using System;
using System.Threading.Tasks;
using Noundry.Slurp;

class Program
{
    static async Task Main()
    {
        // Configure the Slurp engine
        var config = new SlurpConfiguration
        {
            Provider = DatabaseProvider.PostgreSQL,
            ConnectionString = "Host=localhost;Database=mydb;Username=user;Password=pass",
            BatchSize = 10000,
            ParallelDegree = 4
        };

        // Create the engine
        var engine = new SlurpEngine(config);

        // Configure CSV options
        var csvOptions = new CsvOptions
        {
            Delimiter = ',',
            HasHeaders = true,
            QuoteCharacter = '"'
        };

        // Perform the ingestion
        var result = await engine.IngestAsync(
            "data/large_dataset.csv",
            "target_table",
            csvOptions
        );

        // Check the results
        Console.WriteLine($"✓ Ingested {result.RowCount} rows");
        Console.WriteLine($"⚡ Speed: {result.RowsPerSecond:N0} rows/sec");
        Console.WriteLine($"⏱️ Time: {result.ElapsedTime}");
    }
}
// Define an explicit schema instead of relying on inference
var schema = new TableSchema
{
    Name = "products",
    Columns = new[]
    {
        new Column("id", DbType.Int32, isPrimaryKey: true),
        new Column("name", DbType.String, maxLength: 200),
        new Column("price", DbType.Decimal),
        new Column("created", DbType.DateTime)
    }
};

engine.IngestWithSchema("products.csv", schema);
// Report progress during long-running ingests
var progress = new Progress<SlurpProgress>(p =>
{
    Console.WriteLine($"Progress: {p.PercentComplete:P}");
    Console.WriteLine($"Rows: {p.RowsProcessed:N0} / {p.TotalRows:N0}");
    Console.WriteLine($"Speed: {p.RowsPerSecond:N0} rows/sec");
});

await engine.IngestAsync(
    "huge_file.csv",
    progress: progress
);
Tips and recommendations for optimal performance and reliability.
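Batch size and parallelism are the usual first knobs to tune. A hedged sketch using only the configuration options shown above (the specific values are illustrative starting points, not recommendations from the library):

```csharp
using System;
using Noundry.Slurp;

// Narrow rows tolerate larger batches (fewer round-trips); wide rows
// usually want smaller batches to keep memory per batch bounded.
var config = new SlurpConfiguration
{
    Provider = DatabaseProvider.PostgreSQL,
    ConnectionString = "Host=localhost;Database=mydb;Username=user;Password=pass",
    BatchSize = 50000,                          // illustrative value
    ParallelDegree = Environment.ProcessorCount // scale with available cores
};
```

Measure with your own data: the progress callback shown earlier reports rows/sec, which makes before/after comparisons straightforward.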
Start using Noundry.Slurp today and experience blazing-fast CSV to database ingestion with intelligent automation.