
⚡︎ BoxLang Module: BoxLang CSV

| ⚡︎ B o x L a n g ⚡︎ |
| :----------------------------: |
| Dynamic : Modular : Productive |
Copyright Since 2023 by Ortus Solutions, Corp
www.boxlang.io | www.ortussolutions.com

 

📋 Overview

The BoxLang CSV module provides a comprehensive and modern API for working with CSV (Comma-Separated Values) files in BoxLang. Built on top of Apache Commons CSV, this module offers both traditional BIF-style functions and a powerful fluent API for creating, reading, parsing, and manipulating CSV files.

✨ Key Features

  • 📊 Create & Manipulate - Create new CSV files or work with existing files
  • 🔄 Format Flexibility - Support for custom delimiters, quotes, escapes, and line separators
  • Fluent API - Modern, chainable interface for elegant code
  • 🎯 Type Support - Handles strings, numbers, dates, and complex data
  • 📦 Export/Import - Convert to/from JSON, Query, and arrays of structs
  • 🚀 Streaming - Memory-efficient processing for large files with process()
  • 🔍 Filter & Map - Transform data on the fly during load with filter() and map()
  • 🎨 Header Support - Automatic header detection and manipulation
  • 🛠️ Configurable - Extensive configuration options for parsing and writing

📦 Installation

Install the module using CommandBox:

box install bx-csv

🚀 Quick Start

Here's a taste of what you can do with the fluent API:

// Create and write a CSV file with fluent API (recommended)
CSV()
    .setHeaders( "Name", "Email", "Age" )
    .addRow( [ "John Doe", "[email protected]", 30 ] )
    .addRow( [ "Jane Smith", "[email protected]", 25 ] )
    .save( "contacts.csv" );

// Load and parse an existing CSV file
data = CSV( "data.csv" ).toArray();

// Stream process a large file
CSV().process( "huge-file.csv", ( row ) => {
    // Process each row without loading entire file into memory
    println( "Name: " & row[1] );
} );

💡 Best Practice: We recommend using the fluent CSV() API directly over the individual BIFs for a more modern, readable, and maintainable codebase.

🎯 Basic Usage

Creating a New CSV File

There are multiple ways to create CSV files: using the fluent API (recommended), traditional BIFs, or static constructors.

// Create a new empty CSV file
csv = CSV();

// Create from a path
csv = CSV( "data.csv" );

// Create with custom delimiter
csv = CSV( path="data.csv", delimiter="|" );

// Create with configuration
csv = CSV()
    .delimiter( '|' )
    .headers( true )
    .trim( true );

Using Static Constructors

// Create from JSON
jsonData = '[{"name":"John","age":30},{"name":"Jane","age":25}]';
csv = CSVFile.fromJson( jsonData );

// Create from Array of Structs
data = [
    { "name": "John", "age": 30 },
    { "name": "Jane", "age": 25 }
];
csv = CSVFile.fromArray( data );

// Create from Query
qData = queryExecute( "SELECT name, email FROM users" );
csv = CSVFile.fromQuery( qData );

// Create empty
csv = CSVFile.empty();

Writing Data

Setting Headers and Adding Rows

// Fluent API - build CSV data
CSV()
    .setHeaders( "Name", "Email", "Age" )
    .addRow( [ "John Doe", "[email protected]", 30 ] )
    .addRow( [ "Jane Smith", "[email protected]", 25 ] )
    .addRow( [ "Bob Johnson", "[email protected]", 35 ] )
    .save( "contacts.csv" );

Setting Row Data by Row Number

// Set specific rows (1-based indexing)
CSV()
    .setRowData( 1, [ "Product", "Price", "Quantity" ] )
    .setRowData( 2, [ "Widget A", 10.99, 100 ] )
    .setRowData( 3, [ "Widget B", 15.99, 50 ] )
    .save( "products.csv" );

Reading Data

Loading from File

// Load entire file into memory
csv = CSV( "data.csv" );

// Or load explicitly
csv = CSV().load( "data.csv" );

// Get all data as array
allData = csv.toArray();

// Get headers
headers = csv.getHeaders();

// Get row count
rowCount = csv.getRowCount();

// Get specific row (1-based)
row2 = csv.getRowData( 2 );

Loading from String

// Parse CSV content from a string
csvContent = "Name,Age,City
John,30,NYC
Jane,25,LA";

csv = CSV().loadFromString( csvContent );
data = csv.toArray();

Loading with Configuration

// Load with custom delimiter and options
csv = CSV()
    .delimiter( '|' )
    .trim( true )
    .skipHeaderRecord( false )
    .load( "data.txt" );

Saving CSV Files

// Save to file
csv = CSV()
    .setHeaders( "Col1", "Col2", "Col3" )
    .addRow( [ "A", "B", "C" ] )
    .save( "output.csv" );

// Save with overwrite
csv.overwrite( true ).save( "existing.csv" );

// Save to specific path
csv.setPath( "reports/data.csv" ).save();

Exporting and Importing Data

The CSVFile class provides convenient methods for exporting CSV data to various formats and importing data from those formats.

Exporting to JSON

// Create CSV and export to JSON
csv = CSV()
    .setHeaders( "Name", "Age", "City" )
    .addRow( [ "John", 30, "New York" ] )
    .addRow( [ "Jane", 25, "Los Angeles" ] );

jsonString = csv.toJson();
// Returns: [{"Name":"John","Age":"30","City":"New York"},{"Name":"Jane","Age":"25","City":"Los Angeles"}]

// Pretty-print JSON
jsonPretty = csv.toJson( true );

Exporting to Query

// Convert CSV to BoxLang Query object
csv = CSV()
    .setHeaders( "Product", "Price", "Quantity" )
    .addRow( [ "Widget A", "10.99", "100" ] )
    .addRow( [ "Widget B", "15.99", "50" ] );

query = csv.toQuery();
// Query object with VARCHAR columns

Exporting to Array

// Export as array of arrays
csv = CSV( "data.csv" );
arrayData = csv.toArray();

Exporting to String

// Convert to CSV string
csv = CSV()
    .setHeaders( "A", "B", "C" )
    .addRow( [ "1", "2", "3" ] );

csvString = csv.toString();
// Returns: "A,B,C\n1,2,3\n"

Importing from JSON

// Create CSV from JSON string
jsonData = '[{"Name":"Alice","Score":95},{"Name":"Bob","Score":87}]';
csv = CSVFile.fromJson( jsonData );
csv.save( "scores.csv" );

Importing from Array of Structs

// Create CSV from array of structs
data = [
    { "Product": "Widget A", "Price": 10.99, "InStock": true },
    { "Product": "Widget B", "Price": 15.99, "InStock": false }
];

csv = CSVFile.fromArray( data );
csv.save( "products.csv" );

Importing from Query

// Create CSV from query results
qData = queryExecute( "SELECT name, email, status FROM users" );
csv = CSVFile.fromQuery( qData );
csv.save( "users-export.csv" );

Round-trip Export/Import Example

// Export to JSON, modify, and import back
original = CSV()
    .setHeaders( "Name", "Email" )
    .addRow( [ "John", "[email protected]" ] );

json = original.toJson();

// Send JSON to API, save to database, etc.
// ...

// Later, import back
imported = CSVFile.fromJson( json );
imported.save( "restored.csv" );

🎨 Intermediate Usage

Working with Headers

// CSV with headers enabled (default)
csv = CSV()
    .headers( true )
    .setHeaders( "Name", "Age", "Email" )
    .addRow( [ "John", 30, "[email protected]" ] );

// Get headers
headers = csv.getHeaders();
// Returns: ["Name", "Age", "Email"]

// CSV without headers
csv = CSV()
    .headers( false )
    .addRow( [ "John", "30", "[email protected]" ] )
    .addRow( [ "Jane", "25", "[email protected]" ] );

Custom Delimiters and Quotes

// Pipe-delimited file
csv = CSV()
    .delimiter( '|' )
    .setHeaders( "Name", "Age", "City" )
    .addRow( [ "John", 30, "New York" ] )
    .save( "data.txt" );

// Tab-delimited (TSV)
csv = CSV()
    .delimiter( '\t' )
    .load( "data.tsv" );

// Custom quote character
csv = CSV()
    .quote( '\'' )
    .escape( '\'' )
    .load( "data.csv" );

Filtering and Mapping Data

// Filter rows during load (only include adults)
csv = CSV()
    .filter( ( row ) => {
        return row[2] >= 18; // Age column
    } )
    .load( "people.csv" );

// Transform rows during load (uppercase names)
csv = CSV()
    .map( ( row ) => {
        row[1] = row[1].toUpperCase();
        return row;
    } )
    .load( "data.csv" );

// Combine filter and map
csv = CSV()
    .filter( ( row ) => row[2] != "" ) // Skip empty emails
    .map( ( row ) => {
        row[2] = row[2].trim().toLowerCase();
        return row;
    } )
    .load( "contacts.csv" );

Working with Different Line Separators

// Unix line endings
csv = CSV()
    .lineSeparator( "\n" )
    .save( "unix.csv" );

// Windows line endings
csv = CSV()
    .lineSeparator( "\r\n" )
    .save( "windows.csv" );

// Mac classic line endings
csv = CSV()
    .lineSeparator( "\r" )
    .save( "mac.csv" );

Trimming and Space Handling

// Trim whitespace from values
csv = CSV()
    .trim( true )
    .load( "data.csv" );

// Ignore surrounding spaces (more comprehensive than trim)
csv = CSV()
    .ignoreSurroundingSpaces( true )
    .load( "data.csv" );

Handling Empty Lines and Comments

// Skip empty lines (default: true)
csv = CSV()
    .ignoreEmptyLines( true )
    .load( "data.csv" );

// Support comment lines
csv = CSV()
    .commentMarker( '#' )
    .load( "data.csv" );
// Lines starting with # will be ignored

// Add header comments when writing
csv = CSV()
    .commentMarker( '#' )
    .headerComments( "Generated on " & now(), "Author: System" )
    .setHeaders( "Col1", "Col2" )
    .addRow( [ "A", "B" ] )
    .save( "data.csv" );

Null String Handling

// Represent null values as "NULL"
csv = CSV()
    .nullString( "NULL" )
    .addRow( [ "John", null, "NYC" ] )
    .save( "data.csv" );
// Output: John,NULL,NYC

// Read null strings back
csv = CSV()
    .nullString( "NULL" )
    .load( "data.csv" );

Quote Modes

// Quote all values
csv = CSV()
    .quoteMode( "ALL" )
    .save( "quoted.csv" );

// Quote only when necessary (default)
csv = CSV()
    .quoteMode( "MINIMAL" )
    .save( "minimal.csv" );

// Quote non-numeric values
csv = CSV()
    .quoteMode( "NON_NUMERIC" )
    .save( "numbers.csv" );

// Never quote
csv = CSV()
    .quoteMode( "NONE" )
    .save( "unquoted.csv" );

🚀 Advanced Usage

Streaming Large Files

For large CSV files that don't fit in memory, use the process() method to stream through rows:

// Process a large file without loading into memory
CSV().process( "huge-file.csv", ( row ) => {
    // Process each row individually
    println( "Processing: " & row[1] );

    // Do something with the row (database insert, API call, etc.)
    queryExecute(
        "INSERT INTO table (col1, col2) VALUES (?, ?)",
        [ row[1], row[2] ]
    );
} );

// Process with pre-configured CSV settings
CSV()
    .delimiter( '|' )
    .trim( true )
    .filter( ( row ) => row[2] > 0 ) // Only process positive values
    .process( "data.txt", ( row ) => {
        // Process filtered rows
        processRow( row );
    } );

// Stream with path set beforehand
CSV()
    .setPath( "large-file.csv" )
    .process( ( row ) => {
        // Process each row
        println( row[1] & ": " & row[2] );
    } );

Complex Fluent API Examples

Building a Report from Database Query

// Query database and export to CSV
qData = queryExecute( "
    SELECT
        customer_name,
        order_date,
        product_name,
        quantity,
        unit_price,
        (quantity * unit_price) AS total
    FROM orders
    WHERE order_date >= ?
", [ dateAdd( "m", -1, now() ) ] );

// Convert to CSV with formatting
CSVFile.fromQuery( qData )
    .delimiter( ',' )
    .quoteMode( "MINIMAL" )
    .trim( true )
    .save( "reports/monthly-orders.csv" );

Loading, Transforming, and Saving

// Load existing CSV, transform data, save to new file
CSV( "input/raw-data.csv" )
    .filter( ( row ) => {
        // Only include rows with valid email
        return row[3].find( "@" ) > 0;
    } )
    .map( ( row ) => {
        // Normalize data
        row[1] = row[1].trim().toUpperCase(); // Name
        row[2] = row[2].trim(); // Phone
        row[3] = row[3].trim().toLowerCase(); // Email
        return row;
    } )
    .trim( true )
    .save( "output/cleaned-data.csv" );

Processing Multiple Files and Combining

// Combine multiple CSV files into one
combined = CSV().setHeaders( "Name", "Email", "Source" );

// Process each file and add to combined
[ "file1.csv", "file2.csv", "file3.csv" ].each( ( file ) => {
    CSV().process( file, ( row ) => {
        // Skip header row if present
        if ( row[1] != "Name" ) {
            // Add source file name
            combined.addRow( [ row[1], row[2], file ] );
        }
    } );
} );

combined.save( "combined-output.csv" );

Converting Between Formats

// CSV to JSON
CSV( "data.csv" ).toJson( true ); // Pretty-printed

// JSON to CSV
CSVFile.fromJson( myJsonString ).save( "output.csv" );

// Query to CSV
CSVFile.fromQuery( myQuery ).save( "export.csv" );

// CSV to Query
query = CSV( "data.csv" ).toQuery();

Handling Duplicate Headers

// Allow duplicate column names
csv = CSV()
    .allowDuplicateHeaderNames( true )
    .load( "data-with-dupes.csv" );

// Disallow duplicates (throws exception)
csv = CSV()
    .allowDuplicateHeaderNames( false )
    .load( "data.csv" );

Skip Header Record Configuration

// Skip first row as header (default: true)
csv = CSV()
    .skipHeaderRecord( true )
    .load( "data.csv" );
// First row becomes column names, not included in data

// Include header row in data (skipHeaderRecord=false)
csv = CSV()
    .skipHeaderRecord( false )
    .load( "data.csv" );
// First row is treated as data

Auto-flush and Trailing Delimiters

// Auto-flush after each record (better reliability, slower)
csv = CSV()
    .autoFlush( true )
    .addRow( [ "A", "B", "C" ] )
    .save( "data.csv" );

// Add trailing delimiter at end of each row
csv = CSV()
    .trailingDelimiter( true )
    .save( "data.csv" );
// Output: A,B,C,

Performance Tips

// For large in-memory datasets:
// - Use filter/map to reduce data size during load
// - Use streaming process() for files > 100MB

// For frequent writes:
// - Disable autoFlush for better performance
// - Build data in memory, save once

// For reading multiple files:
// - Use process() to avoid memory issues
// - Consider parallel processing in separate threads

// Example: Efficient processing
CSV()
    .ignoreEmptyLines( true )
    .trim( true )
    .filter( ( row ) => row[1] != "" ) // Skip invalid rows early
    .process( "large-file.csv", ( row ) => {
        // Minimal processing per row
        saveToDatabase( row );
    } );

📖 BIF Reference

Core Functions

CSV( [path], [delimiter], [hasHeaders], [trim], [ignoreEmptyLines], [skipHeaderRecord] )

Returns a fluent CSVFile object for creating or manipulating CSV files.

Parameters:

  • path (optional, string) - Path to load an existing file or where the file will be saved
  • delimiter (optional, string) - Delimiter character (default: ",")
  • hasHeaders (optional, boolean) - Whether CSV has headers (default: true)
  • trim (optional, boolean) - Whether to trim whitespace (default: true)
  • ignoreEmptyLines (optional, boolean) - Whether to ignore empty lines (default: true)
  • skipHeaderRecord (optional, boolean) - Whether to skip header record when parsing (default: true)

Returns: CSVFile object

Example:

// Create new
csv = CSV();

// Load existing
csv = CSV( "data.csv" );

// Load with custom delimiter
csv = CSV( path="data.txt", delimiter="|" );

// Load with full configuration
csv = CSV(
    path="data.csv",
    delimiter=",",
    hasHeaders=true,
    trim=true,
    ignoreEmptyLines=true,
    skipHeaderRecord=true
);

CSVFile Fluent API Methods

The CSVFile class provides a fluent, chainable API for all CSV operations.
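
As a quick orientation, here is a small sketch chaining several of the methods documented below; the file name users.csv and the column assumptions are illustrative only.

// Configure, load (applying a filter during parsing), and export in one chain
report = CSV()
    .delimiter( ',' )
    .trim( true )
    .filter( ( row ) => row[1] != "" ) // keep rows whose first column is not empty
    .load( "users.csv" )
    .toQuery();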

Static Constructors

CSVFile.empty()

  • Creates an empty CSVFile instance
  • Returns: CSVFile object

CSVFile.fromJson( jsonString )

  • Creates CSVFile from JSON array of objects
  • Parameters: jsonString (string) - JSON array
  • Returns: CSVFile object
  • Example: CSVFile.fromJson('[{"name":"John","age":30}]')

CSVFile.fromArray( arrayOfStructs )

  • Creates CSVFile from array of structs
  • Parameters: arrayOfStructs (array) - Array of structs
  • Returns: CSVFile object

CSVFile.fromQuery( query )

  • Creates CSVFile from BoxLang Query
  • Parameters: query (Query) - Query object
  • Returns: CSVFile object
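
For instance, assuming an illustrative users table:

// From an array of structs
csv = CSVFile.fromArray( [ { "name": "Ana", "age": 40 } ] );

// From a query object
csv = CSVFile.fromQuery( queryExecute( "SELECT name, age FROM users" ) );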

File I/O Methods

load( path )

  • Loads CSV file from specified path
  • Parameters: path (string) - File path
  • Returns: this (for chaining)
  • Example: csv.load("data.csv")

loadFromString( csvContent )

  • Loads CSV data from a string
  • Parameters: csvContent (string) - CSV content
  • Returns: this (for chaining)

loadFromJson( jsonString )

  • Loads CSV data from JSON string
  • Parameters: jsonString (string) - JSON array
  • Returns: this (for chaining)

loadFromQuery( query )

  • Loads CSV data from Query object
  • Parameters: query (Query) - Query object
  • Returns: this (for chaining)

loadFromArray( arrayOfStructs )

  • Loads CSV data from array of structs
  • Parameters: arrayOfStructs (array) - Array of structs
  • Returns: this (for chaining)

save( path, [overwrite] )

  • Saves CSV file to specified path
  • Parameters:
    • path (string) - File path
    • overwrite (optional, boolean) - Whether to overwrite existing file
  • Returns: this (for chaining)
  • Example: csv.save("output.csv", true)

save()

  • Saves CSV file using previously set path
  • Returns: this (for chaining)
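
A quick sketch combining these I/O methods (paths and content are illustrative):

// Parse CSV content from a string, then persist it to disk, overwriting if present
CSV()
    .loadFromString( "Name,Age
John,30" )
    .save( "people.csv", true );

// Build from a query and save using a previously set path
CSVFile.fromQuery( queryExecute( "SELECT name, age FROM people" ) )
    .setPath( "exports/people.csv" )
    .save();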

Configuration Methods

delimiter( char )

  • Sets delimiter character
  • Parameters: char (character) - Delimiter (e.g., ',', '|', '\t')
  • Returns: this (for chaining)
  • Example: csv.delimiter('|')

quote( char )

  • Sets quote character
  • Parameters: char (character) - Quote character (default: '"')
  • Returns: this (for chaining)

escape( char )

  • Sets escape character
  • Parameters: char (character) - Escape character (default: '"')
  • Returns: this (for chaining)

lineSeparator( string )

  • Sets line separator string
  • Parameters: string (string) - Line separator (e.g., "\n", "\r\n")
  • Returns: this (for chaining)

trim( boolean )

  • Sets whether to trim whitespace from values
  • Parameters: boolean (boolean) - Enable/disable trimming
  • Returns: this (for chaining)

headers( boolean )

  • Sets whether CSV has headers in first row
  • Parameters: boolean (boolean) - Enable/disable headers
  • Returns: this (for chaining)

skipHeaderRecord( boolean )

  • Sets whether to skip header record when parsing
  • Parameters: boolean (boolean) - Enable/disable skipping
  • Returns: this (for chaining)

allowMissingColumnNames( boolean )

  • Sets whether to allow missing column names
  • Parameters: boolean (boolean) - Enable/disable
  • Returns: this (for chaining)

ignoreEmptyLines( boolean )

  • Sets whether to ignore empty lines
  • Parameters: boolean (boolean) - Enable/disable
  • Returns: this (for chaining)

nullString( string )

  • Sets string representation for null values
  • Parameters: string (string) - Null representation (e.g., "NULL", "N/A")
  • Returns: this (for chaining)

commentMarker( char )

  • Sets comment marker character
  • Parameters: char (character) - Comment marker (e.g., '#')
  • Returns: this (for chaining)

ignoreSurroundingSpaces( boolean )

  • Sets whether to ignore spaces around values
  • Parameters: boolean (boolean) - Enable/disable
  • Returns: this (for chaining)

quoteMode( string )

  • Sets quote mode for writing
  • Parameters: string (string) - Quote mode: "ALL", "MINIMAL", "NON_NUMERIC", "NONE", "ALL_NON_NULL"
  • Returns: this (for chaining)
  • Example: csv.quoteMode("ALL")

allowDuplicateHeaderNames( boolean )

  • Sets whether to allow duplicate column names
  • Parameters: boolean (boolean) - Enable/disable
  • Returns: this (for chaining)

autoFlush( boolean )

  • Sets whether to flush after each record write
  • Parameters: boolean (boolean) - Enable/disable
  • Returns: this (for chaining)

trailingDelimiter( boolean )

  • Sets whether to add trailing delimiter at end of records
  • Parameters: boolean (boolean) - Enable/disable
  • Returns: this (for chaining)

headerComments( ...comments )

  • Sets header comments to write before CSV data
  • Parameters: comments (string...) - Variable arguments of comment strings
  • Returns: this (for chaining)
  • Example: csv.headerComments("File created on " & now(), "Author: System")

overwrite( boolean )

  • Sets whether to overwrite existing files when saving
  • Parameters: boolean (boolean) - Enable/disable overwriting
  • Returns: this (for chaining)

setPath( path )

  • Sets file path for future operations
  • Parameters: path (string) - File path
  • Returns: this (for chaining)

Data Manipulation Methods

setHeaders( ...headers )

  • Sets header columns for the CSV
  • Parameters: headers (string...) - Variable arguments of header names
  • Returns: this (for chaining)
  • Example: csv.setHeaders("Name", "Email", "Age")

setRowData( rowNumber, rowData )

  • Sets data for a specific row (1-based indexing)
  • Parameters:
    • rowNumber (number) - Row number (1-based)
    • rowData (array) - Array of values for the row
  • Returns: this (for chaining)
  • Example: csv.setRowData(1, ["A", "B", "C"])

addRow( rowData )

  • Adds a new row to the CSV
  • Parameters: rowData (array) - Array of values for the row
  • Returns: this (for chaining)
  • Example: csv.addRow(["John", "[email protected]", 30])

getRowData( rowNumber )

  • Gets data for a specific row (1-based indexing)
  • Parameters: rowNumber (number) - Row number (1-based)
  • Returns: Array of values

getRowCount()

  • Gets the number of rows in the CSV
  • Returns: Number of rows

getColumnCount()

  • Gets the number of columns in the CSV
  • Returns: Number of columns

getHeaders()

  • Gets the header columns
  • Returns: Array of header names, or null if no headers

clear()

  • Clears all data from the CSV
  • Returns: this (for chaining)
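
A quick sketch combining the accessor methods above (the file name is illustrative):

csv = CSV( "data.csv" );

// Inspect the loaded data
totalRows    = csv.getRowCount();
totalColumns = csv.getColumnCount();
columnNames  = csv.getHeaders();
firstRow     = csv.getRowData( 1 );

// Wipe the data so the instance can be reused
csv.clear();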

Filtering and Mapping Methods

filter( predicate )

  • Sets a filter predicate to apply when loading CSV data
  • Parameters: predicate (function) - Function that receives a row array and returns boolean
  • Returns: this (for chaining)
  • Example: csv.filter((row) => row[2] >= 18)

map( mapper )

  • Sets a mapper function to transform rows when loading CSV data
  • Parameters: mapper (function) - Function that receives and returns a row array
  • Returns: this (for chaining)
  • Example: csv.map((row) => { row[1] = row[1].toUpperCase(); return row; })

Streaming Methods

process( path, consumer )

  • Streams through CSV file row-by-row without loading into memory
  • Parameters:
    • path (string) - Path to CSV file
    • consumer (function) - Function that receives each row array
  • Returns: this (for chaining)
  • Example: csv.process("large.csv", (row) => println(row[1]))

process( consumer )

  • Streams through CSV file using previously set path
  • Parameters: consumer (function) - Function that receives each row array
  • Returns: this (for chaining)

Export Methods

toArray()

  • Converts CSV data to array of arrays
  • Returns: Array

toJson( [pretty] )

  • Converts CSV data to JSON string
  • Parameters: pretty (optional, boolean) - Pretty-print JSON (default: false)
  • Returns: JSON string

toQuery()

  • Converts CSV data to BoxLang Query object
  • Returns: Query object with VARCHAR columns

toString()

  • Converts CSV data to string representation
  • Returns: CSV string

Information Methods

getPath()

  • Gets the current file path
  • Returns: File path string or null

getDelimiter()

  • Gets the current delimiter character
  • Returns: Character

hasHeaders()

  • Gets whether the CSV has headers
  • Returns: Boolean

getData()

  • Gets the raw CSV data as list of string arrays
  • Returns: List of string arrays
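
These are handy for quick inspections, for example (the path is illustrative, and row indexing follows the 1-based convention used throughout this document):

csv = CSV( path="data.txt", delimiter="|" );

println( "Path: " & csv.getPath() );
println( "Delimiter: " & csv.getDelimiter() );
println( "Has headers: " & csv.hasHeaders() );

// Iterate the raw rows; each row is an array of string values
for( row in csv.getData() ) {
    println( row[1] );
}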

THE DAILY BREAD

"I am the way, and the truth, and the life; no one comes to the Father, but by me (JESUS)" - John 14:6

Changelog

All notable changes to this project will be documented in this file.

The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.


Unreleased

1.0.0 - 2025-10-28

  • First iteration of this module with BIF support for CSV operations
