Noundry.Slurp

High-performance CSV to database ingestion tool that processes millions of rows in seconds with zero-allocation parsing and intelligent schema inference.

Blazing Fast Performance

Optimized for speed with zero-allocation CSV parsing and parallel processing capabilities.

  • 100K+ rows per second
  • Zero memory allocations
  • 4+ database providers
  • Smart schema inference

Powerful Features

Everything you need for efficient CSV data ingestion into your database.

High Performance

Leverages the SEP library for zero-allocation CSV parsing, achieving exceptional throughput of 100,000+ rows per second.

  • Minimal memory footprint
  • Parallel processing
  • Streaming support
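
For a sense of what the zero-allocation layer looks like, here is a minimal sketch of streaming rows with the Sep library that Slurp builds on. The file name and column names are illustrative only; Slurp performs this wiring for you internally.

using nietras.SeparatedValues;

// Sketch: streaming rows with Sep, the parser Slurp is built on.
// The file and column names below are illustrative.
using var reader = Sep.Reader().FromFile("data.csv");

foreach (var row in reader)
{
    // Columns are read as spans and parsed without allocating
    // intermediate strings.
    var id = row["id"].Parse<int>();
    var amount = row["amount"].Parse<decimal>();
    // ...accumulate into a batch for the database writer
}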

Multi-Database Support

Works seamlessly with all major database providers, adapting to their specific capabilities and optimizations.

  • SQL Server
  • PostgreSQL
  • MySQL & SQLite
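
Switching targets is a matter of configuration rather than code changes. Below is a brief sketch using the configuration type from the library example later on this page; DatabaseProvider.PostgreSQL appears there, while the SQLite member name and the connection strings here are assumptions for illustration.

using Noundry.Slurp;

// Sketch: the same engine pointed at two different providers.
// The SQLite enum member name and connection strings are assumptions.
var postgres = new SlurpConfiguration
{
    Provider = DatabaseProvider.PostgreSQL,
    ConnectionString = "Host=localhost;Database=analytics;Username=user;Password=pass"
};

var sqlite = new SlurpConfiguration
{
    Provider = DatabaseProvider.SQLite,
    ConnectionString = "Data Source=./data/customers.db"
};

var engine = new SlurpEngine(postgres); // swap in the other configuration to change targets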

Intelligent Features

Smart automation reduces manual configuration while maintaining full control when needed.

  • Automatic schema inference
  • Intelligent indexing
  • Type detection
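
For full control, the Advanced Configuration section below shows how to supply a schema by hand. For previewing what inference decides, something along these lines is conceivable; note that InferSchemaAsync and the column property names are hypothetical, used purely to illustrate the idea, and are not a documented API.

using System;
using Noundry.Slurp;

// Hypothetical sketch only: previewing an inferred schema before loading.
// InferSchemaAsync is NOT a documented Slurp API; it stands in for the
// automatic inference that normally runs inside IngestAsync.
var engine = new SlurpEngine(config); // config as in the library example below
var schema = await engine.InferSchemaAsync("products.csv");

foreach (var column in schema.Columns)
{
    Console.WriteLine($"{column.Name}: {column.Type}"); // property names assumed
}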

Installation

Get started with Noundry.Slurp in seconds using your preferred package manager.

Package Installation

$ dotnet add package Noundry.Slurp

Global CLI Tool

$ dotnet tool install -g Noundry.Slurp

Requirements

  • .NET 8.0 or 9.0 SDK
  • Supported database (SQL Server, PostgreSQL, MySQL, or SQLite)
  • CSV files with consistent structure

Usage Examples

Multiple ways to use Slurp for your data ingestion needs.

Interactive Mode

Simple Interactive CLI

# Launch interactive CLI
$ slurp data.csv

? Select database provider: 
  ❯ PostgreSQL
    SQL Server
    MySQL
    SQLite

? Enter server: localhost
? Enter database: mydb
? Enter username: user
? Enter password: ******

✓ Ingested 1,234,567 rows in 12.34 seconds

Features

  • Beautiful interactive prompts guide you through configuration
  • Automatic detection of CSV format and encoding
  • Progress bar with real-time statistics

Command Line Mode

PostgreSQL Example

$ slurp sales_data.csv \
    --provider postgres \
    --server localhost \
    --database analytics \
    --username admin \
    --password secure123 \
    --table sales_2024

SQL Server Example

$ slurp inventory.csv \
    -p sqlserver \
    -s "tcp:myserver.database.windows.net,1433" \
    -d InventoryDB \
    -u sa \
    --password "MyStr0ngP@ssw0rd"

SQLite Example

$ slurp customers.csv \
    --provider sqlite \
    --database "./data/customers.db" \
    --create-if-missing

Library Usage in Code

using System;
using System.Threading.Tasks;
using Noundry.Slurp;

class Program
{
    static async Task Main()
    {
        // Configure the Slurp engine
        var config = new SlurpConfiguration
        {
            Provider = DatabaseProvider.PostgreSQL,
            ConnectionString = "Host=localhost;Database=mydb;Username=user;Password=pass",
            BatchSize = 10000,
            ParallelDegree = 4
        };

        // Create the engine and ingest data
        var engine = new SlurpEngine(config);

        // Configure CSV options
        var csvOptions = new CsvOptions
        {
            Delimiter = ',',
            HasHeaders = true,
            QuoteCharacter = '"'
        };

        // Perform the ingestion
        var result = await engine.IngestAsync(
            "data/large_dataset.csv",
            "target_table",
            csvOptions
        );

        // Check the results
        Console.WriteLine($"✓ Ingested {result.RowCount} rows");
        Console.WriteLine($"⚡ Speed: {result.RowsPerSecond:N0} rows/sec");
        Console.WriteLine($"⏱️ Time: {result.ElapsedTime}");
    }
}

Advanced Configuration

Schema Customization

var schema = new TableSchema
{
    Name = "products",
    Columns = new[]
    {
        new Column("id", DbType.Int32, isPrimaryKey: true),
        new Column("name", DbType.String, maxLength: 200),
        new Column("price", DbType.Decimal),
        new Column("created", DbType.DateTime)
    }
};

engine.IngestWithSchema("products.csv", schema);

Progress Monitoring

var progress = new Progress<SlurpProgress>(p =>
{
    Console.WriteLine($"Progress: {p.PercentComplete:P}");
    Console.WriteLine($"Rows: {p.RowsProcessed:N0} / {p.TotalRows:N0}");
    Console.WriteLine($"Speed: {p.RowsPerSecond:N0} rows/sec");
});

await engine.IngestAsync(
    "huge_file.csv",
    progress: progress
);

Best Practices

Tips and recommendations for optimal performance and reliability.

Performance Tips

  • Use appropriate batch sizes (10,000-50,000 rows) for your database (see the sketch after this list)
  • Enable parallel processing for large files (>1M rows)
  • Disable indexes during bulk load, recreate after
  • Use SSD storage for temporary data when possible
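
Here is a short sketch of applying the first two tips with the configuration type shown earlier; the values are starting points rather than universally optimal settings.

using System;
using Noundry.Slurp;

// Sketch: tuning the documented configuration knobs per the tips above.
// Measure against your own database; these values are only a starting point.
var config = new SlurpConfiguration
{
    Provider = DatabaseProvider.PostgreSQL,
    ConnectionString = "Host=localhost;Database=analytics;Username=user;Password=pass",
    BatchSize = 25000,                            // inside the 10,000-50,000 range above
    ParallelDegree = Environment.ProcessorCount   // parallelism for large (>1M row) files
};

var engine = new SlurpEngine(config);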

Data Preparation

  • Ensure CSV files have consistent structure and encoding
  • Remove or handle special characters in headers
  • Validate date formats match database expectations
  • Consider pre-sorting data by primary key if possible

Ready to Accelerate Your Data Ingestion?

Start using Noundry.Slurp today and experience blazing-fast CSV to database ingestion with intelligent automation.