DevToys.PocoCsv.Core 4.5.3

dotnet add package DevToys.PocoCsv.Core --version 4.5.3                
NuGet\Install-Package DevToys.PocoCsv.Core -Version 4.5.3                
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.
<PackageReference Include="DevToys.PocoCsv.Core" Version="4.5.3" />                
For projects that support PackageReference, copy this XML node into the project file to reference the package.
paket add DevToys.PocoCsv.Core --version 4.5.3                
#r "nuget: DevToys.PocoCsv.Core, 4.5.3"                
#r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.
// Install DevToys.PocoCsv.Core as a Cake Addin
#addin nuget:?package=DevToys.PocoCsv.Core&version=4.5.3

// Install DevToys.PocoCsv.Core as a Cake Tool
#tool nuget:?package=DevToys.PocoCsv.Core&version=4.5.3                

DevToys.PocoCsv.Core

DevToys.PocoCsv.Core is a class library for reading and writing CSV very fast. It contains CsvStreamReader, CsvStreamWriter and the serialization classes CsvReader<T> and CsvWriter<T>.
It provides plenty of options for how you read from or write to CSV files.

  • Extremely fast.
  • Handles unlimited file sizes.
  • RFC 4180 compliant.
  • Sequential read with ReadAsEnumerable().
  • CSV schema retrieval with CsvUtils.GetCsvSchema().
  • DataTable import and export.
  • Serializer / deserializer.
  • Stream reader / writer.
  • Works with all encoding types.

CsvStreamReader

    string _file = @"C:\Temp\data.csv";
    using (CsvStreamReader _reader = new CsvStreamReader(_file))
    {
        while (!_reader.EndOfStream)
        {
            string[] _resultArray = _reader.ReadCsvLine();
        }
    }

or

    string _file = @"C:\Temp\data.csv";
    using (CsvStreamReader _reader = new CsvStreamReader(_file))
    {
        _reader.SetColumnIndexes(2, 5); // only include columns 2 and 5 in the result array (optional).

        foreach (string[] items in _reader.ReadAsEnumerable())
        {
            
        }
    }

or use the string[] deconstruct extension methods (max 10 parameters)

    using (CsvStreamReader _reader = new CsvStreamReader(_file))
    {
        foreach(var (first, second, third) in _reader.ReadAsEnumerable())
        {

        }
    }

or use the dictionary functions.
Note: this option may incur some performance degradation.

    using (CsvStreamReader _reader = new CsvStreamReader(_file))
    {
        while (!_reader.EndOfStream)
        {            
            Dictionary<string,string> _values = _reader.ReadCsvLineAsDictionary(); 
            string _id = _values["Id"];
            string _name = _values["Name"];
        }
    }
Methods / Property Description
CurrentLine Returns the current line number.
DetectSeparator() Detects the separator by sampling the first 10 rows; the position is moved back to the start afterwards.
EndOfStream Indicates the stream has ended.
GetCsvSchema() Returns a schema for the CSV with the best-fitting types to use.
GetCsvSeparator() Detects and sets the CSV separator.
MoveToStart() Moves the reader to start position 0.
Position Gets / sets the position.
ReadAsEnumerable() Each iteration reads the next row from the stream or file.
ReadCsvLine() Reads the CSV line into a string array and advances to the next.
ReadCsvLineAsDictionary() Assumes the first line is the header with column names.
ReadAsEnumerableDictionary() Assumes the first line is the header with column names.
ReadLine() Performs ReadCsvLine().
ResetColumnIndexes() Resets the column indexes to the default, including all columns in the result array.
Separator Gets / sets the separator character to use.
SetColumnIndexes() Limits the result array for ReadCsvLine() to only these columns.
Skip() Skips a row without materializing it; useful for skipping the header.
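Combining a few of the members above, a typical reading loop looks roughly like the sketch below (the file path is illustrative; DetectSeparator() and Skip() are used as described in the table):

```csharp
using DevToys.PocoCsv.Core;

// Sketch: auto-detect the separator, skip the header, then read the remaining rows.
string _file = @"C:\Temp\data.csv"; // illustrative path

using (CsvStreamReader _reader = new CsvStreamReader(_file))
{
    _reader.DetectSeparator(); // samples the first 10 rows, then moves back to the start
    _reader.Skip();            // skip the header row without materializing it

    while (!_reader.EndOfStream)
    {
        string[] _row = _reader.ReadCsvLine();
        // process _row...
    }
}
```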

CsvStreamWriter

    string file = @"D:\Temp\test.csv";
    using (CsvStreamWriter _writer = new CsvStreamWriter(file))
    {
        var _line = new string[] { "Row 1", "Row A,A", "Row 3", "Row B" };
        _writer.WriteCsvLine(_line);
    }
Item Description
Separator CSV separator to use, default ','.
CRLFMode Determines which mode to use for new lines: CR + LF → Windows; CR (Carriage Return) → Mac OS before X; LF (Line Feed) → Unix/Mac OS X.
WriteCsvLine() Writes an array of strings to the CSV stream, escaping when necessary.
SetColumnIndexes() Limits the output columns from the source array.
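A short sketch of the writer options above (the path is illustrative; SetColumnIndexes is assumed to take column indexes the same way the reader's overload does):

```csharp
using DevToys.PocoCsv.Core;

// Sketch: write semicolon-separated output and limit which source columns are written.
string _file = @"D:\Temp\out.csv"; // illustrative path

using (CsvStreamWriter _writer = new CsvStreamWriter(_file))
{
    _writer.Separator = ';';
    _writer.SetColumnIndexes(0, 2); // only write columns 0 and 2 of each source array

    _writer.WriteCsvLine(new string[] { "Id", "ignored", "Name" });
    _writer.WriteCsvLine(new string[] { "1", "ignored", "A;B" }); // the ';' is escaped automatically
}
```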

CsvReader<T>

The CsvReader is a fully typed CSV deserializer.
All simple types can be used as property types, including byte[]. All other (complex) types are ignored.

    public class Data
    {
        [Column(Index = 0)]
        public string Column1 { get; set; }

        [Column(Index = 1)]
        public decimal Column2 { get; set; }

        [Column(Index = 2)]
        public string Column3 { get; set; }

        [Column(Index = 5)]
        public string Column5 { get; set; }
    }
    
    string file = @"D:\Temp\data.csv";

    using (CsvReader<Data> _reader = new(file))
    {        
        _reader.Culture = CultureInfo.GetCultureInfo("en-us");
        _reader.SkipHeader();
        var _data = _reader.ReadAsEnumerable().Where(p => p.Column1.Contains("16"));
        var _materialized = _data.ToList();
    }    

For performance reasons, the reader is not related to the CsvStreamReader.

The reader does not care about the number of columns in a row, as long as the highest index on the Column attribute does not exceed the number of columns in that row.
You only specify the column indexes you need.

Methods / Property Description
BufferSize Stream buffer size, default: 1024.
Close() Closes the CSV stream reader.
CurrentLine Returns the current line number.
DetectSeparator() Auto-sets the separator (looks for commonly used separators in the first 10 lines).
DetectEncodingFromByteOrderMarks Indicates whether to look for byte order marks at the beginning of the file.
Dispose() Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
EmptyLineBehaviour EmptyLineBehaviour: <li>DefaultInstance: return a new instance of T (default).</li><li>NullValue: return a null value for the object.</li><li>SkipAndReadNext: when an empty line occurs, the reader moves to the next line.</li><li>LogError: create an entry in the Errors collection.</li><li>ThrowException: throw an exception when an empty line occurs.</li>
Encoding The character encoding to use.
EndOfStream Returns true when the end of the stream is reached. Use this when you are using Read() / Skip() or a partial ReadAsEnumerable().
Errors Returns a list of errors when HasErrors returns true.
Flush() Flushes all buffers.
HasErrors Indicates there are errors.
IgnoreColumnAttributes All properties are handled in order of property occurrence and mapped directly to their respective index. Only use when the CsvWriter has this set to true as well. (ColumnAttribute is ignored.)
MoveToStart() Moves the reader to the start position. Skip() and Take() alter the start position; use MoveToStart() to reset it.
Open() Opens the reader. (This method is optional; the reader will auto-open when necessary.)
Read() Reads the current row into T and advances the reader to the next row.
ReadCsvLine() Reads the current row into a string[], just like the CsvStreamReader, and advances the reader to the next row.
ReadHeader() Moves to the start and performs a ReadCsvLine().
ReadAsEnumerable() Reads and deserializes one CSV line per iteration, which allows querying large files. It starts from the current position; if you used Skip(), Read() or SkipHeader(), the current position is determined by those methods.
Separator Sets the separator to use (default ',').
Skip(int rows) Skips rows and advances the reader without interpreting them. This is much faster than IEnumerable.Skip().
SkipHeader() Ensures the stream is at the start, then skips the first row.

The path given to the constructor can be a specific file or a directory. When a directory is given, the filename is expected to be [TYPENAME].csv, or is based on the FileName given by the CsvAttribute.

(Skip does not deserialize; that's why it's faster than normal IEnumerable operations.)

CsvWriter<T>

The CsvWriter is a fully typed CSV serializer.
All simple types can be used as property types, including byte[]. All other (complex) types are ignored.

    public class Data
    {
        [Column(Index = 0)]
        public string Column1 { get; set; }

        [Column(Index = 1)]
        public decimal Column2 { get; set; }

        [Column(Index = 2)]
        public string Column3 { get; set; }

        [Column(Index = 5)]
        public string Column5 { get; set; }
    }


    private IEnumerable<Data> LargeData()
    {
        for (int ii = 0; ii < 10000; ii++)
        {
            Data _line = new()
            {
                Column1 = "bij",
                Column2 = 109.59M,
                Column3 = "test",
                Column5 = $"{ii}",
            };
            yield return _line;
        }
    }
    
    
    string file = @"D:\largedata.csv";
    using (CsvWriter<Data> _writer = new(file) { Separator = ',', Append = true })
    {
        _writer.Culture = CultureInfo.GetCultureInfo("en-us");
        _writer.Write(LargeData());
    }
      

Methods / Properties:

Item Description
FileMode Determines whether to create a new file or append to an existing file.
Open() Opens the writer. (This method is optional; the writer will auto-open when necessary.)
WriteHeader() Writes a header with the property names of T.
Write(IEnumerable<T> rows) Writes data to the CSV while consuming rows.
Flush() Flushes all buffers.
IgnoreColumnAttributes All properties are handled in order of property occurrence and mapped directly to their respective index. (ColumnAttribute is ignored.)
Separator Sets the separator to use (default ',').
CRLFMode Determines which mode to use for new lines: CR + LF → Windows; CR (Carriage Return) → Mac OS before X; LF (Line Feed) → Unix/Mac OS X.
NullValueBehaviour Determines what to do when writing null objects: <li>Skip: ignore the object.</li><li>Empty Line: write an empty line.</li>
Culture Sets the default culture for decimal / double conversions etc. For more complex conversions use the ICustomCsvParse interface.
Encoding The character encoding to use.

The path given to the constructor can be a specific file or a directory. When a directory is given, the filename is generated based on the T type name, or based on the FileName given by the CsvAttribute.

For performance reasons, the writer is not related to the CsvStreamWriter.

ColumnAttribute

The column attribute defines the properties to be serialized or deserialized.

Item Description
Index Defines the index position within the CSV document. Numbers can be skipped so the reader ignores certain columns; for the writer, skipped numbers lead to empty columns.
Header Defines the header text. This property only applies to the CsvWriter; if not specified, the property name is used.
OutputFormat Applies a string format, depending on the property type. This property is for the CsvWriter only.
OutputNullValue Defines the value to write as a default for null. This property is for the CsvWriter only.
CustomParserType Allows custom parsing of values to a specific type.
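A sketch combining these attribute properties on one class (the class, property names and format strings are illustrative, not part of the library):

```csharp
using System;
using DevToys.PocoCsv.Core;

// Illustrative POCO showing Index, Header, OutputFormat and OutputNullValue together.
public class Order
{
    [Column(Index = 0, Header = "OrderId")]
    public int Id { get; set; }

    // Index 1 is skipped on purpose: the writer emits an empty column there.
    [Column(Index = 2, Header = "OrderDate", OutputFormat = "yyyy-MM-dd")]
    public DateTime Date { get; set; }

    [Column(Index = 3, OutputNullValue = "n/a")]
    public string Comment { get; set; } // null is written as "n/a"
}
```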

CustomParserType

CustomParserType allows the Reader<T> and Writer<T> to use custom parsing for a specific field.

In general you can use the Column attribute on any simple type and the value will be converted if possible. When using the reader you might have CSVs from third-party sources where columns require some extra conversion; this is where custom parsers come in handy.

Custom parsers run as a singleton per specified column in the specific Reader<T> or Writer<T>.

All values and characters at this point are unescaped / escaped as required by the CSV standards.

Interface Method Description
Read This function is called when using the CsvReader. The return value must be of the same type as the property the CustomParser is placed on.
Reading This method is called when using the CsvReader. It can be used as a support function to Read when reading per char is a performance requirement. Don't implement this method when you don't need it. c is the character to use in the result text and should be appended to the value StringBuilder; escaping has already been done at this point.
Write This function is called when using the CsvWriter. The T value must be of the same type as the property the CustomParser is placed on.

    public sealed class ParseBoolean : ICustomCsvParse<bool?>
    {
        // for CsvReader
        public bool? Read(StringBuilder value)
        {
            switch (value.ToString().ToLower())
            {
                case "on":
                case "true":
                case "yes":
                case "1":
                    return true;
                case "off":
                case "false":
                case "no":
                case "0":
                    return false;
            }
            return null;
        }

        // This is the default implementation of the Reading method; you can normally leave it out.
        public void Reading(StringBuilder value, int line, int colIndex, long readerPos, int linePos, int colPos, char c)  
        {
            value.Append(c);
        }

        // for CsvWriter
        public string Write(bool? value)
        {
            if (value.HasValue)
            {
                if (value == true)
                {
                    return "1";
                }
                return "0";
            }
            return string.Empty;
        }
    }


    public class ParseDecimal : ICustomCsvParse<Decimal>
    {
        private CultureInfo _culture;

        public ParseDecimal()
        {
            _culture = CultureInfo.GetCultureInfo("en-us");
        }

        public Decimal Read(StringBuilder value) => Decimal.Parse(value.ToString(), _culture);

        public string Write(Decimal value) => value.ToString(_culture);
    }


    public sealed class CsvPreParseTestObject
    {
        [Column(Index = 0, CustomParserType = typeof(ParseBoolean) )]
        public Boolean? IsOk { get; set; }

        [Column(Index = 1)]
        public string Name { get; set; }

        [Column(Index = 3, CustomParserType = typeof(ParseDecimal))]
        public Decimal Price { get; set; }
    }


    using (var _reader = new CsvReader<CsvPreParseTestObject>(_file))
    {
        _reader.Skip(); // Skip header.
        var _rows = _reader.ReadAsEnumerable().ToArray(); // Materialize.
    }

CsvAttribute

On the CsvAttribute, defaults for CustomParserType can be set; these CustomParserTypes are applied to all properties of that specific type until they are overruled at property level.

Item Description
FileName When a directory is specified on the constructor for CsvReader or CsvWriter instead of a file, this FileName will be used within specified directory.
DefaultCustomParserTypeString Default custom parser type for string properties.
DefaultCustomParserTypeGuid Default custom parser type for Guid properties.
DefaultCustomParserTypeBoolean Default custom parser type for Boolean properties.
DefaultCustomParserTypeDateTime Default custom parser type for DateTime properties.
DefaultCustomParserTypeDateTimeOffset Default custom parser type for DateTimeOffset properties.
DefaultCustomParserTypeTimeSpan Default custom parser type for TimeSpan properties.
DefaultCustomParserTypeByte Default custom parser type for Byte properties.
DefaultCustomParserTypeSByte Default custom parser type for SByte properties.
DefaultCustomParserTypeInt16 Default custom parser type for Int16 properties.
DefaultCustomParserTypeInt32 Default custom parser type for Int32 properties.
DefaultCustomParserTypeInt64 Default custom parser type for Int64 properties.
DefaultCustomParserTypeSingle Default custom parser type for Single properties.
DefaultCustomParserTypeDecimal Default custom parser type for Decimal properties.
DefaultCustomParserTypeDouble Default custom parser type for Double properties.
DefaultCustomParserTypeUInt16 Default custom parser type for UInt16 properties.
DefaultCustomParserTypeUInt32 Default custom parser type for UInt32 properties.
DefaultCustomParserTypeUInt64 Default custom parser type for UInt64 properties.
DefaultCustomParserTypeBigInteger Default custom parser type for BigInteger properties.
DefaultCustomParserTypeGuidNullable Default custom parser type for nullable Guid properties.
DefaultCustomParserTypeBooleanNullable Default custom parser type for nullable Boolean properties.
DefaultCustomParserTypeDateTimeNullable Default custom parser type for nullable DateTime properties.
DefaultCustomParserTypeDateTimeOffsetNullable Default custom parser type for nullable DateTimeOffset properties.
DefaultCustomParserTypeTimeSpanNullable Default custom parser type for nullable TimeSpan properties.
DefaultCustomParserTypeByteNullable Default custom parser type for nullable Byte properties.
DefaultCustomParserTypeSByteNullable Default custom parser type for nullable SByte properties.
DefaultCustomParserTypeInt16Nullable Default custom parser type for nullable Int16 properties.
DefaultCustomParserTypeInt32Nullable Default custom parser type for nullable Int32 properties.
DefaultCustomParserTypeInt64Nullable Default custom parser type for nullable Int64 properties.
DefaultCustomParserTypeSingleNullable Default custom parser type for nullable Single properties.
DefaultCustomParserTypeDecimalNullable Default custom parser type for nullable Decimal properties.
DefaultCustomParserTypeDoubleNullable Default custom parser type for nullable Double properties.
DefaultCustomParserTypeUInt16Nullable Default custom parser type for nullable UInt16 properties.
DefaultCustomParserTypeUInt32Nullable Default custom parser type for nullable UInt32 properties.
DefaultCustomParserTypeUInt64Nullable Default custom parser type for nullable UInt64 properties.
DefaultCustomParserTypeBigIntegerNullable Default custom parser type for nullable BigInteger properties.

    public class Parsestring : ICustomCsvParse<string>
    {
        public string Read(StringBuilder value)
        {
            return value.ToString();
        }
        public string Write(string value)
        {
            return value;
        }
    }

    [Csv( DefaultCustomParserTypeString = typeof(Parsestring))]
    public class CsvAllTypes
    {
        [Column(Index = 0, OutputFormat = "", OutputNullValue = "")]
        public string _stringValue { get; set; }

        [Column(Index = 35, OutputFormat = "", OutputNullValue = "")]
        public string _stringValue2 { get; set; }

        [Column(Index = 1, CustomParserType = typeof(ParseGuid), OutputFormat = "", OutputNullValue = "")]
        public Guid _GuidValue { get; set; }
   }

Sampling only a few rows without reading the entire CSV.


    List<CsvSimple> _result1;
    List<CsvSimple> _result2;

    string file = @"D:\largedata.csv";

    using (CsvReader<CsvSimple> _reader = new CsvReader<CsvSimple>(file))
    {
        _reader.Skip(); // skip the header row.

        // Materializes 20 records but returns 10.
        _result1 = _reader.ReadAsEnumerable().Skip(10).Take(10).ToList();

        // Materializes only 10 records.
        _reader.Skip(10);
        _result2 = _reader.ReadAsEnumerable().Take(10).ToList();
    }

Note that Skip and Take advance the reader to the next position;
executing another _reader.ReadAsEnumerable().Where(p => p...).ToList() will query from position 21.

Use MoveToStart() to move the reader to the starting position.

_reader.Skip() is different from _reader.ReadAsEnumerable().Skip(): the first does not materialize to T and is faster.
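Putting the two together, a minimal sketch of resetting the position with MoveToStart() (reusing the CsvSimple type and file from the example above):

```csharp
using System.Collections.Generic;
using System.Linq;
using DevToys.PocoCsv.Core;

// Sketch: read the same 10 rows twice by resetting the reader in between.
using (CsvReader<CsvSimple> _reader = new CsvReader<CsvSimple>(file))
{
    _reader.Skip(); // skip the header row
    List<CsvSimple> _first = _reader.ReadAsEnumerable().Take(10).ToList();

    _reader.MoveToStart(); // back to position 0
    _reader.Skip();        // skip the header row again
    List<CsvSimple> _again = _reader.ReadAsEnumerable().Take(10).ToList(); // same 10 rows
}
```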

Serialize / Deserialize plain C# objects without specific ColumnAttributes

Mapping is determined by the header in the CSV; columns are only mapped to corresponding property names.

    public class SimpleObject
    {
        public int Id { get; set; }
        public string Field1 { get; set; }
        public string Field2 { get; set; }
    }
    private IEnumerable<SimpleObject> Data(int count = 50)
    {
        for (int ii = 0; ii < count; ii++)
        {
            yield return  new SimpleObject() { Id = ii, Field1 = $"A{ii}", Field2 = $"b{ii}" };                
        }
    }
    string _file = System.IO.Path.GetTempFileName();

    using (CsvWriter<SimpleObject> _writer = new(_file) { Separator = ',' })
    {
        _writer.WriteHeader();
        _writer.Write(Data());
    }

    using (CsvReader<SimpleObject> _reader = new(_file))
    {
        List<SimpleObject> _materialized = _reader.ReadAsEnumerable().ToList();
    }

CsvDataTypeObject

A convenience class to read up to 50 CSV columns from a CSV document.

  • All fields are string only.
  • This object can be useful if you want to use the CsvReader<T> on unknown CSV files.
  • The object can be used on any CSV regardless of the number of columns. (Column indexes above 50 are ignored.)
  • The object has Deconstruct implemented, so you can use shorthands for fields.
  • The object is comparable by value.
  • Implicit conversion from a CSV line string to CsvDataTypeObject and back to string.

    using (CsvReader<CsvDataTypeObject> _reader = new(_file))
    {
        foreach (var item in _reader.ReadAsEnumerable())
        {
            string id = item.Field01;
            string name = item.Field02;
        }
    }
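The implicit conversion mentioned above can be sketched as follows (a minimal illustration; separator and escaping follow the library's CSV handling):

```csharp
using DevToys.PocoCsv.Core;

// Sketch: convert a CSV line string to a CsvDataTypeObject and back via the implicit operators.
CsvDataTypeObject _row = "1,Alice,alice@example.com"; // parse a CSV line

string _id = _row.Field01;   // first column
string _name = _row.Field02; // second column

string _line = _row; // serialize back to a CSV line string
```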

You can use the Deconstruct shorthand:

    using (CsvReader<CsvDataTypeObject> _reader = new(_file))
    {
        foreach (var (id, name) in _reader.ReadAsEnumerable())
        {            
        }
    }

If you would like to use it with the writer, you can limit the number of output columns with the ColumnLimit property.

    string _file = System.IO.Path.GetTempFileName();

    using (CsvWriter<CsvDataTypeObject> _writer = new(_file) { Separator = ',', ColumnLimit = 5 })
    {
        _writer.WriteHeader();
        _writer.Write(SimpleData(50));
    }

    private IEnumerable<CsvDataTypeObject> SimpleData(int count)
    {
        for (int ii = 0; ii < count; ii++)
        {
            yield return new CsvDataTypeObject() { Field01 = $"A{ii}", Field02 = $"b{ii}", Field03 = $"c{ii}", Field04 = $"d{ii}", Field05 = $"e{ii}" };
        }
    }

DataTable Import / Export

Two extension methods on the DataTable object.


    using DevToys.PocoCsv.Core.Extensions;

    // Import
    var _file = @"C:\data.csv";
    var _table = new DataTable();
    _table.ImportCsv(_file, ',', true);

    // Export
    _file = @"C:\data2.csv";
    _table.ExportCsv(_file, ',');

Product Compatible and additional computed target framework versions.
.NET net5.0 is compatible.  net5.0-windows was computed.  net5.0-windows7.0 is compatible.  net6.0 is compatible.  net6.0-android was computed.  net6.0-ios was computed.  net6.0-maccatalyst was computed.  net6.0-macos was computed.  net6.0-tvos was computed.  net6.0-windows was computed.  net6.0-windows7.0 is compatible.  net7.0 is compatible.  net7.0-android was computed.  net7.0-ios was computed.  net7.0-maccatalyst was computed.  net7.0-macos was computed.  net7.0-tvos was computed.  net7.0-windows was computed.  net7.0-windows7.0 is compatible.  net8.0 is compatible.  net8.0-android was computed.  net8.0-browser was computed.  net8.0-ios was computed.  net8.0-maccatalyst was computed.  net8.0-maccatalyst18.0 is compatible.  net8.0-macos was computed.  net8.0-macos15.0 is compatible.  net8.0-tvos was computed.  net8.0-windows was computed.  net8.0-windows7.0 is compatible.  net9.0 is compatible.  net9.0-maccatalyst18.0 is compatible.  net9.0-macos15.0 is compatible.  net9.0-windows7.0 is compatible. 
.NET Core netcoreapp3.0 is compatible.  netcoreapp3.1 is compatible. 
  • .NETCoreApp 3.0

    • No dependencies.
  • .NETCoreApp 3.1

    • No dependencies.
  • net5.0

    • No dependencies.
  • net5.0-windows7.0

    • No dependencies.
  • net6.0

    • No dependencies.
  • net6.0-windows7.0

    • No dependencies.
  • net7.0

    • No dependencies.
  • net7.0-windows7.0

    • No dependencies.
  • net8.0

    • No dependencies.
  • net8.0-maccatalyst18.0

    • No dependencies.
  • net8.0-macos15.0

    • No dependencies.
  • net8.0-windows7.0

    • No dependencies.
  • net9.0

    • No dependencies.
  • net9.0-maccatalyst18.0

    • No dependencies.
  • net9.0-macos15.0

    • No dependencies.
  • net9.0-windows7.0

    • No dependencies.

NuGet packages

This package is not used by any NuGet packages.

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last updated
4.5.3 0 12/26/2024
4.5.2 70 12/18/2024
4.5.1 73 12/16/2024
4.5.0 72 12/16/2024
4.4.1 51 12/16/2024
4.4.0 91 12/14/2024
4.3.2 108 12/3/2024
4.3.1 88 11/22/2024
4.3.0 83 11/21/2024
4.2.5 84 11/20/2024
4.2.4 84 11/19/2024
4.2.3 99 11/13/2024
4.2.2 165 2/28/2024
4.2.1 121 2/24/2024
4.2.0 133 2/23/2024
4.1.2 108 2/22/2024
4.1.1 138 2/21/2024
4.1.0 131 2/21/2024
4.0.1 150 2/12/2024
4.0.0 136 2/12/2024
3.1.13 115 2/8/2024
3.1.12 155 2/7/2024
3.1.11 111 1/31/2024
3.1.10 122 1/19/2024
3.1.9 127 1/13/2024
3.1.8 127 1/12/2024
3.1.7 114 1/11/2024
3.1.5 138 1/8/2024
3.1.3 180 12/1/2023
3.1.2 140 12/1/2023
3.1.0 125 11/28/2023
3.0.7 213 8/27/2023
3.0.6 155 8/23/2023
3.0.5 165 8/23/2023
3.0.4 164 8/17/2023
3.0.3 180 8/15/2023
3.0.2 180 8/11/2023
3.0.1 199 8/11/2023
3.0.0 177 8/11/2023
2.0.7 224 8/9/2023
2.0.5 185 8/4/2023
2.0.4 184 8/3/2023
2.0.3 154 7/31/2023
2.0.2 181 7/28/2023
2.0.0 182 7/19/2023
1.7.53 221 4/14/2023
1.7.52 219 4/12/2023
1.7.51 206 4/7/2023
1.7.43 236 4/3/2023
1.7.42 218 4/3/2023
1.7.41 202 4/3/2023
1.7.5 207 4/7/2023
1.7.3 247 4/3/2023
1.7.2 235 4/3/2023
1.7.1 225 4/3/2023
1.7.0 233 4/1/2023
1.6.3 230 3/31/2023
1.6.2 232 3/29/2023
1.6.1 225 3/29/2023
1.6.0 221 3/27/2023
1.5.8 244 3/24/2023
1.5.7 216 3/22/2023
1.5.6 231 3/22/2023
1.5.5 240 3/21/2023
1.5.4 249 3/21/2023
1.5.1 238 3/20/2023
1.5.0 243 3/19/2023
1.4.5 239 3/18/2023
1.4.4 278 3/18/2023
1.4.3 234 3/18/2023
1.4.2 250 3/18/2023
1.4.1 216 3/18/2023
1.4.0 234 3/18/2023
1.3.92 245 3/18/2023
1.3.91 250 3/17/2023
1.3.9 237 3/17/2023
1.3.8 215 3/17/2023
1.3.7 244 3/17/2023
1.3.6 209 3/17/2023
1.3.5 226 3/17/2023
1.3.4 248 3/17/2023
1.3.3 237 3/16/2023
1.3.2 218 3/16/2023
1.3.1 245 3/16/2023
1.3.0 201 3/16/2023
1.2.0 239 3/14/2023
1.1.6 279 2/24/2023
1.1.5 324 2/16/2023
1.1.4 487 5/18/2022
1.1.3 723 1/27/2022
1.1.2 651 1/27/2022
1.1.1 704 1/14/2022
1.1.0 5,849 11/23/2021
1.0.5 400 5/11/2021
1.0.4 345 4/14/2021
1.0.3 386 4/12/2021
1.0.2 342 4/12/2021
1.0.1 323 4/7/2021
1.0.0 395 4/7/2021

4.5.3
- Bugfix: nullable value combined with CustomParser.

4.5.2
- Added SetColumnIndexes to CsvStreamWriter as well.

4.5.1
- Minor improvements on CsvStreamReader.ReadCsvLineAsDictionary()

4.5
- Added ability to use only ICustomCsvParse with the Reading method.
- Added CustomBooleanParserNullable and CustomBooleanParser to CsvReader<T>.
- Added CustomLowerCaseParser and CustomUpperCaseParser to be used with CsvReader<T>.
- Added ReadCsvLineAsDictionary() and ReadAsEnumerableDictionary() to CsvStreamReader.

4.4.1
- Added selectIndexes to CsvReader<T>.ReadCsvLine()

4.4
- Added SetColumnIndexes, ResetColumnIndexes to the CsvStreamReader, now you are able to limit the number of columns in the result array.

4.3.2
- Added string[] deconstruct extensions to use with CsvStreamReader.ReadAsEnumerable().
- Bug fix: exceptions not thrown correctly; clearer messages.

4.3.1
- Added FileName to CsvAttribute

4.3
- Added support for Net9.0
- Open() command is optional for CsvReader and CsvWriter
- Added FileMode to CsvWriter, either overwrite or append to the file, the default is overwrite.

4.2.5
- Minor bug fix: error message when the writer was not opened correctly.

4.2.4
- The file can be a directory as well; in case of a directory, the filename will be based on type name T.

4.2.2
- Minor improvements on CsvDataTypeObject

4.2.1
- Dropped: CsvDataTypeObject5, CsvDataTypeObject10, CsvDataTypeObject25, CsvDataTypeObject50, CsvDataTypeObject100
- Introduced: CsvDataTypeObject, this object is the 50 column version.
- Added ColumnLimit to CsvWriter, this can be used to limit the output columns, this can be used in combination with CsvDataTypeObject. Default = 0 (No Limit).

4.2.0
- Added new objects: CsvDataTypeObject5, CsvDataTypeObject10, CsvDataTypeObject25, CsvDataTypeObject50, CsvDataTypeObject100 ( see Readme for more info ).
- Added ReadHeader() to CsvReader
- Added ReadCsvLine() to CsvReader

4.1.0
- Added new feature: Serialize / Deserialize plain C# objects without specific ColumnAttributes

4.0
- Bug fix regarding different encodings in CsvReader and CsvStreamReader, for previous version only UTF8 worked properly.
- Last(x) had to be removed.

3.1.11
- CsvWriter performance improvements.
- CsvStreamWriter performance improvements.

3.1.10
- Minor adjustments

3.1.9
- Changed how the ICustomCsvParser.Reading method works; it now has a default implementation, so it is not necessary to implement it.
When implemented, c should at least be appended to value ( value.Append(c); ).

3.1.7
-  Added Reading method to ICustomCsvParser

3.1.5
- Added IgnoreColumnAttributes for CsvReader and CsvWriter.
All properties are handled in order of property occurrence and mapped directly to their respective index. (ColumnAttribute is ignored.)

3.1.4
- Added .net 8 support.

3.0.6
- Small refactorings

3.0.4
- CsvWriter CustomParser for strings.
- Exceptions to Errors log when using ICustomCsvParser

3.0.3
- Added SkipAndReadNext option to EmptyLineBehaviour for CsvReader; this ignores empty lines altogether.
- Added LogError and ThrowException to EmptyLineBehaviour.
- Bugfix: not serializing/deserializing enums
- Throw Error on not supported property types.

3.0.0
- CsvStreamReader and CsvReader<T> Performance +10%
- BugFix CsvWriter: Flush on Close().

2.0.6
- Small improvements.

2.0.5
- Small improvements.

2.0.4
- Critical Bugfix: Escaped separator not correctly handled.
- Small performance improvements.

2.0.3
- Bugfix: deserialize with fewer columns than in the CSV.

2.0.2
- Bugfix: not properly reading escaped double quotes.
- Minor improvements

2.0
- Improved CsvWriter<T> speed.
- Extended ICustomCsvParser<T> to be supported by the CsvWriter<T> as well.
- ICustomCsvParser<T>.Parse() has been removed.
- Added Read() and Write() to ICustomCsvParser<T>
- Refactored CsvReader<T> and  CsvWriter<T>
- Introduced CsvAttribute; with this attribute, defaults for ICustomCsvParser can be set at class level.

1.7.53
- Improved CsvStreamReader speed.
- Added ReadAsEnumerable() to CsvStreamReader.

1.7.51
- Added DataTable extensions ImportCsv / ExportCsv

1.7.1
- Changed ICustomCsvParse to the generic ICustomCsvParse<T>

1.7
- Added CustomParserType to ColumnAttribute

1.6.3
- Added NullValueBehaviour to CsvWriter<T>
- Added CurrentLine to Reader
- Added LineNumber to Error log
- Added Flush() to Reader<T> and Writer<T>
- Refactored UnitTests in GitHub code Demo Tests and Validate Tests.

1.6.2
- Minor bugfix with CR only ending.

1.6.1
- Fixed bug with AutoDetectSeparator.
- Added EmptyLineBehaviour to CsvReader<T>
- Refactoring

1.6.0
- Added Last(int rows) function to Reader<T>.
- Added IEnumerable<CsvReadError> Errors to CsvReader<T>.
- Fixed Skip() counter.
- Correct handling for CRLF in CsvStreamReader and CsvReader<T>
   -  \r = CR(Carriage Return) → Used as a new line character in Mac OS before X
   -  \n = LF(Line Feed) → Used as a new line character in Unix/Mac OS X
   -  \r\n = CR + LF → Used as a new line character in Windows
- Added  CRLFMode to CsvStreamWriter and CsvWriter<T>

1.5.8
- Minor Improvements
- Added Skip() to CsvStreamReader
- Changed EndOfStream behaviour

1.5.7
- Small improvements

1.5.1
- Updated Readme
- Fixed bug with Skip(rows)
- Fixed small bug with ReadAsEnumerable() always started at position 0.

1.5
- Correct handling Null Types for Reader

1.4.5
- Refactoring
- Removed DynamicReader and DynamicWriter

1.4.2
- Another performance improvement for Reader

1.4
- Performance improvements for Writer.
- Added OutputFormat to ColumnAttribute

1.3.8
- Performance improvement for Reader

1.3.2
- Bug fixes

1.3
- Improved constructors to support all parameters for underlying StreamReader and StreamWriters.
- Added Skip() to CsvReader (to be used in combination Read())
- Added WriteHeader() to CsvWriter()
- Added Header to Column attribute to be used by the CsvWriter
- GetCsvSeparator() / DetectSeparator(), detects more exotic separators.
- Added byte[] to base64 serialization to CsvReader and CsvWriter

1.2
- Added single Read() function.
- Rows() now marked as obsolete.
- Added ReadAsEnumerable() as replacement for Rows()
- Added GetCsvSeparator(int sampleRows) to CsvStreamReader()
- Added DetectSeparator() to CsvReader()

1.1.5
- Bug Fixes

1.1.4
- Added CsvUtils static class including some special Csv functions to use.

1.1.3
- Added CsvWriterDynamic

1.1.1
- Added CsvReaderDynamic

1.1.0
- Speed optimizations (using delegates instead of reflection)

1.0.5
- Read/Write Stream csv lines into a poco object.
- Query / Read / Write large csv files.