DevToys.PocoCsv.Core
1.7.5
See the version list below for details.
dotnet add package DevToys.PocoCsv.Core --version 1.7.5
NuGet\Install-Package DevToys.PocoCsv.Core -Version 1.7.5
<PackageReference Include="DevToys.PocoCsv.Core" Version="1.7.5" />
paket add DevToys.PocoCsv.Core --version 1.7.5
#r "nuget: DevToys.PocoCsv.Core, 1.7.5"
// Install DevToys.PocoCsv.Core as a Cake Addin
#addin nuget:?package=DevToys.PocoCsv.Core&version=1.7.5

// Install DevToys.PocoCsv.Core as a Cake Tool
#tool nuget:?package=DevToys.PocoCsv.Core&version=1.7.5
DevToys.PocoCsv.Core
One of the fastest CSV reader/deserializers available.
DevToys.PocoCsv.Core is a class library for reading and writing CSV. It contains CsvStreamReader, CsvStreamWriter, and the serialization classes CsvReader<T> and CsvWriter<T>.
Read/write and serialize/deserialize data to and from CSV.
- Conforms to RFC 4180.
- Automatic separator detection.
- Automatic line-feed/line-break detection.
- Sequential reads with ReadAsEnumerable().
- CSV schema retrieval with CsvUtils.GetCsvSchema().
- Casting error log.
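Schema retrieval can be used to inspect an unknown CSV before writing a POCO for it. This page does not show the signature of CsvUtils.GetCsvSchema(), so the sketch below is a hypothetical usage: the parameter list (file path plus sample-row count) and the shape of the result are assumptions, not the documented API.

```csharp
// Hypothetical sketch -- parameters and return type are assumptions;
// check the CsvUtils API for the actual signature.
var _schema = CsvUtils.GetCsvSchema(@"C:\Temp\data.csv", 10);
foreach (var _column in _schema)
{
    Console.WriteLine(_column); // e.g. column name/index and inferred type.
}
```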
CsvStreamReader
string file = @"C:\Temp\data.csv";
using (CsvStreamReader _reader = new CsvStreamReader(file))
{
_reader.Separator = ',';
while (!_reader.EndOfStream)
{
List<string> _values = _reader.ReadCsvLine().ToList();
}
}
CsvStreamWriter
string file = @"D:\Temp\test.csv";
using (CsvStreamWriter _writer = new CsvStreamWriter(file))
{
var _line = new string[] { "Row 1", "Row A,A", "Row 3", "Row B" };
_writer.WriteCsvLine(_line);
}
CsvReader<T>
public class Data
{
[Column(Index = 0)]
public string Column1 { get; set; }
[Column(Index = 1)]
public string Column2 { get; set; }
[Column(Index = 2)]
public string Column3 { get; set; }
[Column(Index = 5)]
public string Column5 { get; set; }
}
string file = @"D:\Temp\data.csv";
using (CsvReader<Data> _reader = new(file))
{
_reader.Open();
_reader.Separator = ','; // or use _reader.DetectSeparator();
var _data = _reader.ReadAsEnumerable().Where(p => p.Column1.Contains("16"));
var _materialized = _data.ToList();
}
- Open()
Opens the reader.
- Separator
Sets the separator to use (default ',').
- ReadAsEnumerable()
Reads and deserializes one CSV line per iteration of the collection; this allows querying very large files.
- DetectSeparator()
Auto-detects the separator (looks for commonly used separators in the first 10 lines).
- Skip(int rows)
Advances the reader past the given number of rows without interpreting them. This is much faster than IEnumerable.Skip().
- Last(int rows)
Seeks the CSV document for the last x entries. This is much faster than IEnumerable.Last().
- Read()
Reads the current row into T and advances the reader to the next row.
- MoveToStart()
Moves the reader to the start position. Skip() and Take() alter the start position; use MoveToStart() to reset it.
- EmptyLineBehaviour
Determines how empty lines are deserialized:
  - Default: return a new instance of T.
  - NullValue: return null for the object.
- CurrentLine
Returns the current line number.
- Flush()
Flushes all buffers.
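As a minimal sketch of the single-row API described above (assuming Read() returns an instance of T, which is how the description reads):

```csharp
using (var _reader = new CsvReader<Data>(@"D:\Temp\data.csv"))
{
    _reader.Open();
    _reader.DetectSeparator();   // Auto-detect the separator.
    _reader.Skip();              // Skip the header row without materializing it.
    var _row = _reader.Read();   // Read a single row into Data and advance.
    Console.WriteLine(_reader.CurrentLine); // Line number after the read.
}
```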
CsvWriter<T>
private IEnumerable<CsvSimple> LargeData()
{
for (int ii = 0; ii < 10000000; ii++)
{
CsvSimple _line = new()
{
AfBij = "bij",
Bedrag = "100",
Code = "test",
Datum = "20200203",
Mededelingen = $"test {ii}",
Rekening = "3434",
Tegenrekening = "3423424"
};
yield return _line;
}
}
string file = @"D:\largedata.csv";
using (CsvWriter<CsvSimple> _writer = new(file) { Separator = ',', Append = true })
{
_writer.Open();
_writer.Write(LargeData());
}
- Open()
Opens the writer.
- Separator
Sets the separator to use (default ',').
- WriteHeader()
Writes a header using the property names of T.
- Write(IEnumerable<T> rows)
Writes data to CSV while consuming rows.
- CRLFMode
Determines which mode to use for new lines:
  - CR + LF → used as the new-line sequence in Windows.
  - CR (Carriage Return) → used as the new-line character in Mac OS before X.
  - LF (Line Feed) → used as the new-line character in Unix/Mac OS X.
- NullValueBehaviour
Determines what to do when writing null objects:
  - Skip: ignore the object.
  - Empty Line: write an empty line.
- Flush()
Flushes all buffers.
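Combining the options above, a sketch that writes a header row before streaming the data (reusing the LargeData() generator from the earlier example):

```csharp
using (var _writer = new CsvWriter<CsvSimple>(@"D:\largedata.csv"))
{
    _writer.Separator = ';';
    _writer.Open();
    _writer.WriteHeader();       // Header row from the property names of CsvSimple.
    _writer.Write(LargeData());  // Rows are written while the enumerable is consumed.
    _writer.Flush();             // Flush all buffers.
}
```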
ColumnAttribute
The Column attribute defines the properties to be serialized or deserialized.
- Index
Defines the index position within the CSV document. Index numbers can be skipped: the reader then ignores those columns, while for the writer skipped numbers lead to empty columns.
- Header
Defines the header text. This property applies only to the CsvWriter; if not specified, the property name is used.
- OutputFormat
Applies a string format, depending on the property type. This property is for the CsvWriter only.
- OutputNullValue
Defines the value to write as a default for null. This property is for the CsvWriter only.
- CustomParserType
Allows custom parsing of values to a specific type.
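For example, a writer-side POCO might combine Header, OutputFormat, and OutputNullValue. The Invoice class below is illustrative, not part of the library; the format strings are standard .NET format specifiers:

```csharp
public class Invoice
{
    [Column(Index = 0, Header = "Invoice date", OutputFormat = "yyyy-MM-dd")]
    public DateTime Date { get; set; }

    // Null amounts are written as "[NULL]"; non-null values with two decimals.
    [Column(Index = 1, OutputFormat = "0.00", OutputNullValue = "[NULL]")]
    public decimal? Amount { get; set; }
}
```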
CustomParserType
CustomParserType allows the reader to use custom parsing for a specific field.
public sealed class ParseBoolean : ICustomCsvParse<bool?>
{
public bool? Parse(StringBuilder value)
{
switch (value.ToString().ToLower())
{
case "on":
case "true":
case "yes":
case "1":
return true;
case "off":
case "false":
case "no":
case "0":
return false;
}
return null;
}
}
public sealed class CsvPreParseTestObject
{
[Column(Index = 0, CustomParserType = typeof(ParseBoolean))]
public Boolean? IsOk { get; set; }
[Column(Index = 1)]
public string Name { get; set; }
}
using (var _reader = new CsvReader<CsvPreParseTestObject>(_file))
{
_reader.Open();
_reader.Skip(); // Skip header.
var _rows = _reader.ReadAsEnumerable().ToArray(); // Materialize.
}
Custom parsers run as a singleton per specified column in the specific CsvReader<T>.
Other Examples
public class Data
{
[Column(Index = 0)]
public string Collumn1 { get; set; }
[Column(Index = 1)]
public string Collumn2 { get; set; }
[Column(Index = 2, Header = "Test" )]
public byte[] Collumn3 { get; set; }
[Column(Index = 3)]
public DateTime TestDateTime { get; set; }
[Column(Index = 4)]
public DateTime? TestDateTimeNull { get; set; }
[Column(Index = 5)]
public Int32 TestInt { get; set; }
[Column(Index = 6, OutputNullValue = "[NULL]")]
public Int32? TestIntNull { get; set; }
}
private IEnumerable<Data> GetTestData()
{
yield return new Data
{
Collumn1 = "01",
Collumn2 = "AA",
Collumn3 = new byte[3] { 2, 4, 6 },
TestDateTime = DateTime.Now,
TestDateTimeNull = DateTime.Now,
TestInt = 100,
TestIntNull = 200
};
yield return new Data
{
Collumn1 = "01",
Collumn2 = "AA",
Collumn3 = new byte[3] { 2, 4, 6 },
TestDateTime = DateTime.Now,
TestDateTimeNull = DateTime.Now,
TestInt = 100,
TestIntNull = 200
};
yield return new Data
{
Collumn1 = "04",
Collumn2 = "BB",
Collumn3 = new byte[3] { 8, 9, 10 },
TestDateTime = DateTime.Now,
TestDateTimeNull = null,
TestInt = 300,
TestIntNull = null
};
}
public static string StreamToString(Stream stream)
{
using (StreamReader reader = new StreamReader(stream, Encoding.UTF8))
{
stream.Position = 0;
return reader.ReadToEnd();
}
}
List<Data> _result = new List<Data>();
using (MemoryStream _stream = new MemoryStream())
{
using (CsvWriter<Data> _csvWriter = new CsvWriter<Data>(_stream))
using (CsvReader<Data> _csvReader = new CsvReader<Data>(_stream))
{
_csvWriter.Separator = ';';
_csvWriter.Open();
_csvWriter.WriteHeader();
_csvWriter.Write(GetTestData());
_csvReader.Open();
_csvReader.DetectSeparator(); // Auto detect separator.
_csvReader.Skip(); // Skip header.
_result = _csvReader.ReadAsEnumerable().Where(p => p.Collumn2 == "AA").ToList();
}
}
string _result;
using (MemoryStream _stream = new MemoryStream())
{
using (CsvWriter<Data> _csvWriter = new CsvWriter<Data>(_stream))
{
_csvWriter.Separator = ',';
_csvWriter.Open();
_csvWriter.WriteHeader();
_csvWriter.Write(GetTestData());
_result = StreamToString(_stream);
}
}
Sampling only a few rows without reading the entire CSV:
List<CsvSimple> _result1;
List<CsvSimple> _result2;
string file = @"D:\largedata.csv";
using (CsvReader<CsvSimple> _reader = new CsvReader<CsvSimple>(file))
{
_reader.Open();
_reader.Skip(); // skip the Header row.
// Materializes 20 records but returns 10.
_result1 = _reader.ReadAsEnumerable().Skip(10).Take(10).ToList();
// Materialize only 10 records.
_reader.Skip(10);
_result1 = _reader.ReadAsEnumerable().Take(10).ToList();
// Take last 10 records.
_result1 = _reader.Last(10).ToList();
}
Note that Skip and Take advance the reader to the next position:
executing another _reader.ReadAsEnumerable().Where(p => p...).ToList() afterwards will query from position 21.
Use MoveToStart() to move the reader back to the starting position.
_reader.Skip() is different from _reader.ReadAsEnumerable().Skip(): the first does not materialize rows to T and is therefore faster.
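A sketch of resetting the reader between queries with MoveToStart(), using only the members documented above:

```csharp
using (var _reader = new CsvReader<CsvSimple>(@"D:\largedata.csv"))
{
    _reader.Open();
    _reader.Skip(); // Skip the header row.
    var _first = _reader.ReadAsEnumerable().Take(10).ToList();

    _reader.MoveToStart(); // Reset; otherwise the next query continues mid-file.
    _reader.Skip();        // Skip the header again.
    var _again = _reader.ReadAsEnumerable().Take(10).ToList();
}
```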
DataTable Import / Export
// Import
var _file = @"C:\data.csv";
var _table = new DataTable();
_table.ImportCsv(_file, ',', true);
// Export
_file = @"C:\data2.csv";
_table.ExportCsv(_file, ',');
Product | Compatible and additional computed target framework versions |
---|---|
.NET | net5.0 is compatible. net5.0-windows was computed. net6.0 is compatible. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 is compatible. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 was computed. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. |
.NET Core | netcoreapp3.0 is compatible. netcoreapp3.1 is compatible. |
Dependencies:
- .NETCoreApp 3.0: No dependencies.
- .NETCoreApp 3.1: No dependencies.
- net5.0: No dependencies.
- net6.0: No dependencies.
- net7.0: No dependencies.
NuGet packages
This package is not used by any NuGet packages.
GitHub repositories
This package is not used by any popular GitHub repositories.
Version | Downloads | Last updated |
---|---|---|
4.5.3 | 0 | 12/26/2024 |
4.5.2 | 70 | 12/18/2024 |
4.5.1 | 73 | 12/16/2024 |
4.5.0 | 72 | 12/16/2024 |
4.4.1 | 51 | 12/16/2024 |
4.4.0 | 91 | 12/14/2024 |
4.3.2 | 108 | 12/3/2024 |
4.3.1 | 88 | 11/22/2024 |
4.3.0 | 83 | 11/21/2024 |
4.2.5 | 84 | 11/20/2024 |
4.2.4 | 84 | 11/19/2024 |
4.2.3 | 99 | 11/13/2024 |
4.2.2 | 165 | 2/28/2024 |
4.2.1 | 121 | 2/24/2024 |
4.2.0 | 133 | 2/23/2024 |
4.1.2 | 108 | 2/22/2024 |
4.1.1 | 138 | 2/21/2024 |
4.1.0 | 131 | 2/21/2024 |
4.0.1 | 150 | 2/12/2024 |
4.0.0 | 136 | 2/12/2024 |
3.1.13 | 115 | 2/8/2024 |
3.1.12 | 155 | 2/7/2024 |
3.1.11 | 111 | 1/31/2024 |
3.1.10 | 122 | 1/19/2024 |
3.1.9 | 127 | 1/13/2024 |
3.1.8 | 127 | 1/12/2024 |
3.1.7 | 114 | 1/11/2024 |
3.1.5 | 138 | 1/8/2024 |
3.1.3 | 180 | 12/1/2023 |
3.1.2 | 140 | 12/1/2023 |
3.1.0 | 125 | 11/28/2023 |
3.0.7 | 213 | 8/27/2023 |
3.0.6 | 155 | 8/23/2023 |
3.0.5 | 165 | 8/23/2023 |
3.0.4 | 164 | 8/17/2023 |
3.0.3 | 180 | 8/15/2023 |
3.0.2 | 180 | 8/11/2023 |
3.0.1 | 199 | 8/11/2023 |
3.0.0 | 177 | 8/11/2023 |
2.0.7 | 224 | 8/9/2023 |
2.0.5 | 185 | 8/4/2023 |
2.0.4 | 184 | 8/3/2023 |
2.0.3 | 154 | 7/31/2023 |
2.0.2 | 181 | 7/28/2023 |
2.0.0 | 182 | 7/19/2023 |
1.7.53 | 221 | 4/14/2023 |
1.7.52 | 219 | 4/12/2023 |
1.7.51 | 206 | 4/7/2023 |
1.7.43 | 236 | 4/3/2023 |
1.7.42 | 218 | 4/3/2023 |
1.7.41 | 202 | 4/3/2023 |
1.7.5 | 207 | 4/7/2023 |
1.7.3 | 247 | 4/3/2023 |
1.7.2 | 235 | 4/3/2023 |
1.7.1 | 225 | 4/3/2023 |
1.7.0 | 233 | 4/1/2023 |
1.6.3 | 230 | 3/31/2023 |
1.6.2 | 232 | 3/29/2023 |
1.6.1 | 225 | 3/29/2023 |
1.6.0 | 221 | 3/27/2023 |
1.5.8 | 244 | 3/24/2023 |
1.5.7 | 216 | 3/22/2023 |
1.5.6 | 231 | 3/22/2023 |
1.5.5 | 240 | 3/21/2023 |
1.5.4 | 249 | 3/21/2023 |
1.5.1 | 238 | 3/20/2023 |
1.5.0 | 243 | 3/19/2023 |
1.4.5 | 239 | 3/18/2023 |
1.4.4 | 278 | 3/18/2023 |
1.4.3 | 234 | 3/18/2023 |
1.4.2 | 250 | 3/18/2023 |
1.4.1 | 216 | 3/18/2023 |
1.4.0 | 234 | 3/18/2023 |
1.3.92 | 245 | 3/18/2023 |
1.3.91 | 250 | 3/17/2023 |
1.3.9 | 237 | 3/17/2023 |
1.3.8 | 215 | 3/17/2023 |
1.3.7 | 244 | 3/17/2023 |
1.3.6 | 209 | 3/17/2023 |
1.3.5 | 226 | 3/17/2023 |
1.3.4 | 248 | 3/17/2023 |
1.3.3 | 237 | 3/16/2023 |
1.3.2 | 218 | 3/16/2023 |
1.3.1 | 245 | 3/16/2023 |
1.3.0 | 201 | 3/16/2023 |
1.2.0 | 239 | 3/14/2023 |
1.1.6 | 279 | 2/24/2023 |
1.1.5 | 324 | 2/16/2023 |
1.1.4 | 487 | 5/18/2022 |
1.1.3 | 723 | 1/27/2022 |
1.1.2 | 651 | 1/27/2022 |
1.1.1 | 704 | 1/14/2022 |
1.1.0 | 5,849 | 11/23/2021 |
1.0.5 | 400 | 5/11/2021 |
1.0.4 | 345 | 4/14/2021 |
1.0.3 | 386 | 4/12/2021 |
1.0.2 | 342 | 4/12/2021 |
1.0.1 | 323 | 4/7/2021 |
1.0.0 | 395 | 4/7/2021 |
1.7.5
- Added DataTable extensions ImportCsv / ExportCsv
1.7.1
- Changed ICustomCsvParse to the generic ICustomCsvParse<T>
1.7
- Added CustomParserType to ColumnAttribute
1.6.3
- Added NullValueBehaviour to CsvWriter<T>
- Added CurrentLine to Reader
- Added LineNumber to Error log
- Added Flush() to Reader<T> and Writer<T>
- Refactored the unit tests in the GitHub code into Demo Tests and Validate Tests.
1.6.2
- Minor bugfix with CR only ending.
1.6.1
- Fixed bug with AutoDetectSeparator.
- Added EmptyLineBehaviour to CsvReader<T>
- Refactoring
1.6.0
- Added Last(int rows) function to Reader<T>.
- Added IEnumerable<CsvReadError> Errors to CsvReader<T>.
- Fixed Skip() counter.
- Correct handling for CRLF in CsvStreamReader and CsvReader<T>
- \r = CR(Carriage Return) → Used as a new line character in Mac OS before X
- \n = LF(Line Feed) → Used as a new line character in Unix/Mac OS X
- \r\n = CR + LF → Used as a new line character in Windows
- Added CRLFMode to CsvStreamWriter and CsvWriter<T>
1.5.8
- Minor Improvements
- Added Skip() to CsvStreamReader
- Changed EndOfStream behaviour
1.5.7
- Small improvements
1.5.1
- Updated Readme
- Fixed bug with Skip(rows)
- Fixed small bug with ReadAsEnumerable() always started at position 0.
1.5
- Correct handling Null Types for Reader
1.4.5
- Refactoring
- Removed DynamicReader and DynamicWriter
1.4.2
- Another performance improvement for Reader
1.4
- Performance improvements for Writer.
- Added OutputFormat to ColumnAttribute
1.3.8
- Performance improvement for Reader
1.3.2
- Bug fixes
1.3
- Improved constructors to support all parameters of the underlying StreamReader and StreamWriter.
- Added Skip() to CsvReader (to be used in combination Read())
- Added WriteHeader() to CsvWriter()
- Added Header to Column attribute to be used by the CsvWriter
- GetCsvSeparator() / DetectSeparator() now detect more exotic separators.
- Added byte[] to base64 serialization to CsvReader and CsvWriter
1.2
- Added single Read() function.
- Rows() now marked as obsolete.
- Added ReadAsEnumerable() as replacement for Rows()
- Added GetCsvSeparator(int sampleRows) to CsvStreamReader()
- Added DetectSeparator() to CsvReader()
1.1.5
- Bug Fixes
1.1.4
- Added CsvUtils static class including some special Csv functions to use.
1.1.3
- Added CsvWriterDynamic
1.1.1
- Added CsvReaderDynamic
1.1.0
- Speed optimizations (using delegates instead of reflection)
1.0.5
- Read/Write Stream csv lines into a poco object.
- Query / Read / Write large csv files.