
A simple encapsulation of SqlBulkCopy, the SQL Server bulk-insert scheme, to make batch insertion more convenient

2020-12-08 08:51:39 NewAI

I. An introduction to SQL Server insertion schemes

  • There are three common ways to batch-insert data into SQL Server: Insert, BatchInsert, and SqlBulkCopy. Below we compare the speed of these three schemes.

1. Plain Insert, one statement per row

public static void Insert(IEnumerable<Person> persons)
{
    using (var con = new SqlConnection("Server=.;Database=DemoDataBase;User ID=sa;Password=8888;"))
    {
        con.Open();
        foreach (var person in persons)
        {
            using (var com = new SqlCommand(
                "INSERT INTO dbo.Person(Id,Name,Age,CreateTime,Sex)VALUES(@Id,@Name,@Age,@CreateTime,@Sex)",
                con))
            {
                com.Parameters.AddRange(new[]
                {
                    new SqlParameter("@Id", SqlDbType.BigInt) {Value = person.Id},
                    new SqlParameter("@Name", SqlDbType.VarChar, 64) {Value = person.Name},
                    new SqlParameter("@Age", SqlDbType.Int) {Value = person.Age},
                    new SqlParameter("@CreateTime", SqlDbType.DateTime)
                        {Value = person.CreateTime ?? (object) DBNull.Value},
                    new SqlParameter("@Sex", SqlDbType.Int) {Value = (int)person.Sex},
                });
                com.ExecuteNonQuery();
            }
        }
    }
}

2. BatchInsert: concatenating multi-row INSERT statements

public static void BatchInsert(Person[] persons)
{
    using (var con = new SqlConnection("Server=.;Database=DemoDataBase;User ID=sa;Password=8888;"))
    {
        con.Open();
        // SQL Server allows at most 1,000 rows per INSERT ... VALUES statement,
        // so split the data into pages of 1,000 (ceiling division)
        var pageCount = (persons.Length - 1) / 1000 + 1;
        for (int i = 0; i < pageCount; i++)
        {
            var personList = persons.Skip(i * 1000).Take(1000).ToArray();
            var values = personList.Select(p =>
                $"({p.Id},'{p.Name}',{p.Age},{(p.CreateTime.HasValue ? $"'{p.CreateTime:yyyy-MM-dd HH:mm:ss}'" : "NULL")},{(int) p.Sex})");
            var insertSql =
                $"INSERT INTO dbo.Person(Id,Name,Age,CreateTime,Sex)VALUES{string.Join(",", values)}";
            using (var com = new SqlCommand(insertSql, con))
            {
                com.ExecuteNonQuery();
            }
        }
    }
}

3. The SqlBulkCopy scheme

public static void BulkCopy(IEnumerable<Person> persons)
{
    using (var con = new SqlConnection("Server=.;Database=DemoDataBase;User ID=sa;Password=8888;"))
    {
        con.Open();
        var table = new DataTable();
        table.Columns.AddRange(new []
        {
            new DataColumn("Id", typeof(long)), 
            new DataColumn("Name", typeof(string)), 
            new DataColumn("Age", typeof(int)), 
            new DataColumn("CreateTime", typeof(DateTime)), 
            new DataColumn("Sex", typeof(int)), 
        });
        foreach (var p in persons)
        {
            table.Rows.Add(new object[] {p.Id, p.Name, p.Age, p.CreateTime, (int) p.Sex});
        }

        using (var copy = new SqlBulkCopy(con))
        {
            copy.DestinationTableName = "Person";
            copy.WriteToServer(table);
        }
    }
}

4. Speed comparison of the three schemes

Scheme        Rows      Time
Insert        1,000     145.4351 ms
BatchInsert   1,000     103.9061 ms
SqlBulkCopy   1,000     7.021 ms
Insert        10,000    1501.326 ms
BatchInsert   10,000    850.6274 ms
SqlBulkCopy   10,000    30.5129 ms
Insert        100,000   13875.4934 ms
BatchInsert   100,000   8278.9056 ms
SqlBulkCopy   100,000   314.8402 ms
  • Comparing insertion efficiency, Insert is clearly far slower than SqlBulkCopy, roughly a 20~40x performance gap. Next we encapsulate SqlBulkCopy to make batch insertion more convenient.

II. Encapsulating SqlBulkCopy

1. Method introduction

Signatures of the batch-insert extension methods:

BulkCopy — synchronous batch insert
  • SqlConnection connection — SQL Server connection object
  • IEnumerable<T> source — data source to bulk-insert
  • string tableName = null — destination table name (NULL defaults to the entity type name)
  • int bulkCopyTimeout = 30 — bulk-insert timeout, in seconds
  • int batchSize = 0 — number of rows per batch written to the database (0 writes everything in one batch). The best value depends on your environment, especially the row count and network latency. A practical approach: start with BatchSize = 1000 and see how it performs; if that works, keep doubling it (2000, 4000, ...) until performance degrades or a timeout occurs; if 1000 already times out, halve it (e.g. 500) until it works.
  • SqlBulkCopyOptions options = SqlBulkCopyOptions.Default — bulk copy options
  • SqlTransaction externalTransaction = null — transaction to execute within

BulkCopyAsync — asynchronous batch insert, with the same parameters as BulkCopy
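As a usage sketch (the connection string, the `persons` variable, and the tuning values are illustrative assumptions; the BulkCopy extension itself is the one defined later in this article), a tuned call wrapped in an external transaction might look like:

```csharp
using System.Data.SqlClient;

// Illustrative only: server, credentials, and tuning values are assumptions.
using (var conn = new SqlConnection("Server=.;Database=DemoDataBase;User ID=sa;Password=8888;"))
{
    conn.Open();
    using (var tran = conn.BeginTransaction())
    {
        // BulkCopy is the extension method shown in "3. Complete code" below.
        var inserted = conn.BulkCopy(
            persons,                     // IEnumerable<Person> to insert
            tableName: "Person",
            bulkCopyTimeout: 60,         // seconds
            batchSize: 1000,             // starting point; tune per environment
            options: SqlBulkCopyOptions.Default,
            externalTransaction: tran);
        tran.Commit();
    }
}
```

Passing an external transaction lets the bulk insert commit or roll back together with other statements on the same connection.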
  • This method mainly solves two problems:
    • You no longer build a DataTable or an IDataReader implementation by hand. Hand-built conversions are hard to maintain: change a field and you must change every one of those places, and enum types need special handling (converting them to their underlying type, int by default).
    • You no longer create the SqlBulkCopy object yourself, configure the column mappings, or set its various properties.
  • This scheme is used at our company to meet batch-insert needs such as third-party reconciliation data.
  • The method uses Expression trees to dynamically generate the data-conversion function, making it about as efficient as hand-written code; the extra conversion cost is small (the biggest loss is the boxing of value types).
  • This project differs from other solutions online: instead of converting the List to a DataTable first and then handing it to SqlBulkCopy, it wraps the List in an IDataReader implementation, so each row is converted only at the moment SqlBulkCopy reads it.
  • Advantages of the IDataReader scheme over the DataTable scheme:
    • More efficient: the DataTable scheme must finish the entire conversion before SqlBulkCopy can start writing, while the IDataReader scheme converts while writing (for 100,000 rows this can improve insert speed by about 30%).
    • Less memory: the DataTable scheme materializes all rows up front, which uses a lot of memory; the IDataReader scheme streams rows to SqlBulkCopy and never holds much in memory.
    • More powerful: because it converts while writing, and EnumerableReader is given an iterator, it can insert a continuous stream of data.
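Because EnumerableReader pulls from an iterator, the source never has to be materialized. A hedged sketch of that streaming effect (the generator and connection string are illustrative, and it assumes the Person/Gender types and the BulkCopy extension from this article):

```csharp
using System;
using System.Collections.Generic;
using System.Data.SqlClient;

// Illustrative generator: yields rows one at a time, so even very large
// data sets stream through BulkCopy without ever being held in memory.
static IEnumerable<Person> GeneratePersons(int count)
{
    for (int i = 0; i < count; i++)
        yield return new Person { Id = i + 1, Name = "P" + i, Age = 20, Sex = Gender.Man };
}

using (var conn = new SqlConnection("Server=.;Database=DemoDataBase;User ID=sa;Password=8888;"))
{
    conn.Open();
    // One million rows are converted row-by-row as SqlBulkCopy reads them.
    var inserted = conn.BulkCopy(GeneratePersons(1_000_000));
}
```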

2. Implementation principle

① Mapping the entity model to the table

  • Table creation script
CREATE TABLE [dbo].[Person](
	[Id] [BIGINT] NOT NULL,
	[Name] [VARCHAR](64) NOT NULL,
	[Age] [INT] NOT NULL,
	[CreateTime] [DATETIME] NULL,
	[Sex] [INT] NOT NULL,
PRIMARY KEY CLUSTERED 
(
	[Id] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
  • Entity class code
public class Person
{
    public long Id { get; set; }
    public string Name { get; set; }
    public int Age { get; set; }
    public DateTime? CreateTime { get; set; }
    public Gender Sex { get; set; }
}

public enum Gender
{
    Man = 0,
    Woman = 1
}
  • Create the column mappings. Without them SqlBulkCopy maps by column ordinal, so data can land in the wrong columns, and a type mismatch raises an error.
  • The mappings between the source columns and the database columns are created through the ColumnMappings property of SqlBulkCopy:
// Create batch insert objects 
using (var copy = new SqlBulkCopy(connection, options, externalTransaction))
{
    foreach (var column in ModelToDataTable<TModel>.Columns)
    {
        // Create field mapping 
        copy.ColumnMappings.Add(column.ColumnName, column.ColumnName);
    }
}

② Converting an entity into a data row

  • Converting an entity into a data row relies on reflection plus Expression trees:
    • Reflection gathers the type and property information needed to build the Expression.
    • The Expression is compiled into an efficient conversion function.
  • ModelToDataTable<TModel> exploits the static generic class feature: the runtime keeps a separate set of statics per generic argument, which acts as a per-type cache.
  • The static constructor of ModelToDataTable<TModel> generates the conversion function and collects the column information, storing both in static read-only members; from then on everything is served from that cache.
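The steps above can be sketched in a minimal, self-contained form (RowConverter, Item, and Color are illustrative names, not the article's classes): a Func<T, object[]> compiled once per type that reads every readable property, unwrapping enums to their underlying type.

```csharp
using System;
using System.Linq;
using System.Linq.Expressions;

public enum Color { Red = 0, Blue = 1 }
public class Item { public long Id { get; set; } public Color C { get; set; } }

public static class RowConverter<T>
{
    // Compiled once per T by the static constructor, then cached for the process lifetime.
    public static readonly Func<T, object[]> ToRow;

    static RowConverter()
    {
        var props = typeof(T).GetProperties().Where(p => p.CanRead).ToArray();
        var source = Expression.Parameter(typeof(T), "m");
        var items = props.Select(p =>
        {
            Expression e = Expression.Property(source, p);
            if (p.PropertyType.IsEnum)                    // enum -> underlying type
                e = Expression.Convert(e, p.PropertyType.GetEnumUnderlyingType());
            return (Expression)Expression.Convert(e, typeof(object)); // box into object[]
        });
        ToRow = Expression.Lambda<Func<T, object[]>>(
            Expression.NewArrayInit(typeof(object), items), source).Compile();
    }
}

public class Demo
{
    public static void Main()
    {
        var row = RowConverter<Item>.ToRow(new Item { Id = 7, C = Color.Blue });
        Console.WriteLine(row[0]); // 7
        Console.WriteLine(row[1]); // 1 (the enum was converted to its underlying int)
    }
}
```

After the first call for a given T, every conversion is a single delegate invocation with no reflection on the hot path.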

③ Using the IDataReader overload to insert the data

  • EnumerableReader is the class implementing the IDataReader interface: it wraps the model iterator, reads one model at a time, converts it into a data row, and exposes that row for SqlBulkCopy to read.
  • SqlBulkCopy calls only three of its members: GetOrdinal, Read, and GetValue.
    • GetOrdinal is called only up front, to resolve each column name to its ordinal (this is why the ColumnMappings property of SqlBulkCopy must be filled in).
    • Read advances the iterator to the next model and calls ModelToDataTable<TModel>.ToRowData.Invoke() to convert it into an object[] data row.
    • GetValue returns the value at the given ordinal of the current row.
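That GetOrdinal/Read/GetValue pattern is the generic way any consumer drives an IDataReader (this sketch is not SqlBulkCopy's actual internals; it uses a DataTable-backed reader so the block is self-contained):

```csharp
using System;
using System.Data;

class ReaderConsumerDemo
{
    static void Main()
    {
        var table = new DataTable();
        table.Columns.Add("Id", typeof(long));
        table.Columns.Add("Name", typeof(string));
        table.Rows.Add(1L, "a");
        table.Rows.Add(2L, "b");

        IDataReader reader = table.CreateDataReader();

        // 1) GetOrdinal: resolve each mapped column name once, up front.
        int idOrdinal = reader.GetOrdinal("Id");
        int nameOrdinal = reader.GetOrdinal("Name");

        // 2) Read: advance to the next row; 3) GetValue: pull each mapped value.
        int count = 0;
        while (reader.Read())
        {
            Console.WriteLine($"{reader.GetValue(idOrdinal)}:{reader.GetValue(nameOrdinal)}");
            count++;
        }
        Console.WriteLine(count); // 2
    }
}
```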

3. Complete code

Extension method class

    public static class SqlConnectionExtension
    {
        /// <summary>
        /// Bulk copy
        /// </summary>
        /// <typeparam name="TModel">Model type being inserted</typeparam>
        /// <param name="source">Data source to bulk-insert</param>
        /// <param name="connection">Database connection object</param>
        /// <param name="tableName">Destination table name (NULL defaults to the entity type name)</param>
        /// <param name="bulkCopyTimeout">Insert timeout, in seconds</param>
        /// <param name="batchSize">Rows per batch written to the database (0 writes everything in one batch); start around 1000 and tune up or down for your environment</param>
        /// <param name="options">Bulk copy options</param>
        /// <param name="externalTransaction">Transaction to execute within</param>
        /// <returns>Number of rows inserted</returns>
        public static int BulkCopy<TModel>(this SqlConnection connection,
            IEnumerable<TModel> source,
            string tableName = null,
            int bulkCopyTimeout = 30,
            int batchSize = 0,
            SqlBulkCopyOptions options = SqlBulkCopyOptions.Default,
            SqlTransaction externalTransaction = null)
        {
            // Create a reader 
            using (var reader = new EnumerableReader<TModel>(source))
            {
                // Create batch insert objects 
                using (var copy = new SqlBulkCopy(connection, options, externalTransaction))
                {
                    // Destination table
                    copy.DestinationTableName = tableName ?? typeof(TModel).Name;
                    // Rows per batch written to the database
                    copy.BatchSize = batchSize;
                    // Timeout
                    copy.BulkCopyTimeout = bulkCopyTimeout;
                    // Create column mappings (without them SqlBulkCopy maps by ordinal, so data can land in the wrong columns and type mismatches throw)
                    foreach (var column in ModelToDataTable<TModel>.Columns)
                    {
                        // Create field mapping 
                        copy.ColumnMappings.Add(column.ColumnName, column.ColumnName);
                    }
                    // Write the data to the database in batches 
                    copy.WriteToServer(reader);
                    // Returns the number of inserted data 
                    return reader.Depth;
                }
            }
        }

        /// <summary>
        /// Bulk copy - asynchronous
        /// </summary>
        /// <typeparam name="TModel">Model type being inserted</typeparam>
        /// <param name="source">Data source to bulk-insert</param>
        /// <param name="connection">Database connection object</param>
        /// <param name="tableName">Destination table name (NULL defaults to the entity type name)</param>
        /// <param name="bulkCopyTimeout">Insert timeout, in seconds</param>
        /// <param name="batchSize">Rows per batch written to the database (0 writes everything in one batch); start around 1000 and tune up or down for your environment</param>
        /// <param name="options">Bulk copy options</param>
        /// <param name="externalTransaction">Transaction to execute within</param>
        /// <returns>Number of rows inserted</returns>
        public static async Task<int> BulkCopyAsync<TModel>(this SqlConnection connection,
            IEnumerable<TModel> source,
            string tableName = null,
            int bulkCopyTimeout = 30,
            int batchSize = 0,
            SqlBulkCopyOptions options = SqlBulkCopyOptions.Default,
            SqlTransaction externalTransaction = null)
        {
            // Create a reader 
            using (var reader = new EnumerableReader<TModel>(source))
            {
                // Create batch insert objects 
                using (var copy = new SqlBulkCopy(connection, options, externalTransaction))
                {
                    // Destination table
                    copy.DestinationTableName = tableName ?? typeof(TModel).Name;
                    // Rows per batch written to the database
                    copy.BatchSize = batchSize;
                    // Timeout
                    copy.BulkCopyTimeout = bulkCopyTimeout;
                    // Create column mappings (without them SqlBulkCopy maps by ordinal, so data can land in the wrong columns and type mismatches throw)
                    foreach (var column in ModelToDataTable<TModel>.Columns)
                    {
                        // Create field mapping 
                        copy.ColumnMappings.Add(column.ColumnName, column.ColumnName);
                    }
                    // Write the data to the database in batches 
                    await copy.WriteToServerAsync(reader);
                    // Returns the number of inserted data 
                    return reader.Depth;
                }
            }
        }
    }

Encapsulated iterator data reader

    /// <summary>
    ///  Iterator data reader 
    /// </summary>
    /// <typeparam name="TModel"> Model type </typeparam>
    public class EnumerableReader<TModel> : IDataReader
    {
        /// <summary>
        /// Instantiates the iterator-backed reader
        /// </summary>
        /// <param name="source">Model source</param>
        public EnumerableReader(IEnumerable<TModel> source)
        {
            _source = source ?? throw new ArgumentNullException(nameof(source));
            _enumerable = source.GetEnumerator();
        }

        private readonly IEnumerable<TModel> _source;
        private readonly IEnumerator<TModel> _enumerable;
        private object[] _currentDataRow = Array.Empty<object>();
        private int _depth;
        private bool _release;

        public void Dispose()
        {
            _release = true;
            _enumerable.Dispose();
        }

        public int GetValues(object[] values)
        {
            if (values == null) throw new ArgumentNullException(nameof(values));
            var length = Math.Min(_currentDataRow.Length, values.Length);
            Array.Copy(_currentDataRow, values, length);
            return length;
        }

        public int GetOrdinal(string name)
        {
            for (int i = 0; i < ModelToDataTable<TModel>.Columns.Count; i++)
            {
                if (ModelToDataTable<TModel>.Columns[i].ColumnName == name) return i;
            }

            return -1;
        }

        public long GetBytes(int ordinal, long dataIndex, byte[] buffer, int bufferIndex, int length)
        {
            if (dataIndex < 0) throw new Exception("The starting index cannot be less than 0!");
            if (bufferIndex < 0) throw new Exception("The target buffer starting index cannot be less than 0!");
            if (length < 0) throw new Exception("The read length cannot be less than 0!");
            var numArray = (byte[])GetValue(ordinal);
            if (buffer == null) return numArray.Length;
            if (buffer.Length <= bufferIndex) throw new Exception("The target buffer starting index cannot exceed the target buffer range!");
            // Copy at most `length` bytes, bounded by what the source holds and the target can take
            var freeLength = Math.Min(Math.Min(numArray.Length - (int)dataIndex, buffer.Length - bufferIndex), length);
            if (freeLength <= 0) return 0;
            Array.Copy(numArray, dataIndex, buffer, bufferIndex, freeLength);
            return freeLength;
        }

        public long GetChars(int ordinal, long dataIndex, char[] buffer, int bufferIndex, int length)
        {
            if (dataIndex < 0) throw new Exception("The starting index cannot be less than 0!");
            if (bufferIndex < 0) throw new Exception("The target buffer starting index cannot be less than 0!");
            if (length < 0) throw new Exception("The read length cannot be less than 0!");
            var numArray = (char[])GetValue(ordinal);
            if (buffer == null) return numArray.Length;
            if (buffer.Length <= bufferIndex) throw new Exception("The target buffer starting index cannot exceed the target buffer range!");
            // Copy at most `length` chars, bounded by what the source holds and the target can take
            var freeLength = Math.Min(Math.Min(numArray.Length - (int)dataIndex, buffer.Length - bufferIndex), length);
            if (freeLength <= 0) return 0;
            Array.Copy(numArray, dataIndex, buffer, bufferIndex, freeLength);
            return freeLength;
        }

        public bool IsDBNull(int i)
        {
            var value = GetValue(i);
            return value == null || value is DBNull;
        }

        public bool NextResult()
        {
            // Move to the next element
            if (!_enumerable.MoveNext()) return false;
            // Increment the row count
            Interlocked.Increment(ref _depth);
            // Convert the current model into a data row
            _currentDataRow = ModelToDataTable<TModel>.ToRowData.Invoke(_enumerable.Current);
            return true;
        }

        public byte GetByte(int i) => (byte)GetValue(i);
        public string GetName(int i) => ModelToDataTable<TModel>.Columns[i].ColumnName;
        public string GetDataTypeName(int i) => ModelToDataTable<TModel>.Columns[i].DataType.Name;
        public Type GetFieldType(int i) => ModelToDataTable<TModel>.Columns[i].DataType;
        public object GetValue(int i) => _currentDataRow[i];
        public bool GetBoolean(int i) => (bool)GetValue(i);
        public char GetChar(int i) => (char)GetValue(i);
        public Guid GetGuid(int i) => (Guid)GetValue(i);
        public short GetInt16(int i) => (short)GetValue(i);
        public int GetInt32(int i) => (int)GetValue(i);
        public long GetInt64(int i) => (long)GetValue(i);
        public float GetFloat(int i) => (float)GetValue(i);
        public double GetDouble(int i) => (double)GetValue(i);
        public string GetString(int i) => (string)GetValue(i);
        public decimal GetDecimal(int i) => (decimal)GetValue(i);
        public DateTime GetDateTime(int i) => (DateTime)GetValue(i);
        public IDataReader GetData(int i) => throw new NotSupportedException();
        public int FieldCount => ModelToDataTable<TModel>.Columns.Count;
        public object this[int i] => GetValue(i);
        public object this[string name] => GetValue(GetOrdinal(name));
        public void Close() => Dispose();
        public DataTable GetSchemaTable() => ModelToDataTable<TModel>.ToDataTable(_source);
        public bool Read() => NextResult();
        public int Depth => _depth;
        public bool IsClosed => _release;
        public int RecordsAffected => 0;
    }

Model-to-data-row utility class

    /// <summary>
    /// Converts model objects into data rows / a DataTable
    /// </summary>
    /// <typeparam name="TModel">Model type</typeparam>
    public static class ModelToDataTable<TModel>
    {
        static ModelToDataTable()
        {
            // Modify this line if some columns need to be excluded
            var propertyList = typeof(TModel).GetProperties().Where(w => w.CanRead).ToArray();
            Columns = new ReadOnlyCollection<DataColumn>(propertyList
                .Select(pr => new DataColumn(pr.Name, GetDataType(pr.PropertyType))).ToArray());
            // Generate the model-to-data-row delegate
            ToRowData = BuildToRowDataDelegation(typeof(TModel), propertyList);
        }

        /// <summary>
        /// Builds the model-to-data-row delegate
        /// </summary>
        /// <param name="type">Model type</param>
        /// <param name="propertyList">Properties to convert</param>
        /// <returns>Delegate that converts a model into a data row</returns>
        private static Func<TModel, object[]> BuildToRowDataDelegation(Type type, PropertyInfo[] propertyList)
        {
            var source = Expression.Parameter(type);
            var items = propertyList.Select(property => ConvertBindPropertyToData(source, property));
            var array = Expression.NewArrayInit(typeof(object), items);
            var lambda = Expression.Lambda<Func<TModel, object[]>>(array, source);
            return lambda.Compile();
        }

        /// <summary>
        /// Builds the expression that reads one property's value
        /// </summary>
        /// <param name="source">Source parameter expression</param>
        /// <param name="property">Property info</param>
        /// <returns>Expression producing the property value as an object</returns>
        private static Expression ConvertBindPropertyToData(ParameterExpression source, PropertyInfo property)
        {
            var propertyType = property.PropertyType;
            var expression = (Expression)Expression.Property(source, property);
            if (propertyType.IsEnum)
                expression = Expression.Convert(expression, propertyType.GetEnumUnderlyingType());
            return Expression.Convert(expression, typeof(object));
        }

        /// <summary>
        /// Gets the column data type for a property type
        /// </summary>
        /// <param name="type">Property type</param>
        /// <returns>Column data type</returns>
        private static Type GetDataType(Type type)
        {
            // Enumeration is converted to the corresponding value type by default 
            if (type.IsEnum)
                return type.GetEnumUnderlyingType();
            // Nullable<T>: recurse on the underlying type
            if (type.IsGenericType && type.GetGenericTypeDefinition() == typeof(Nullable<>))
                return GetDataType(type.GetGenericArguments().First());
            return type;
        }

        /// <summary>
        ///  Column set 
        /// </summary>
        public static IReadOnlyList<DataColumn> Columns { get; }

        /// <summary>
        /// Model-to-data-row delegate
        /// </summary>
        public static Func<TModel, object[]> ToRowData { get; }

        /// <summary>
        /// Converts a collection into a DataTable
        /// </summary>
        /// <param name="source">Collection</param>
        /// <param name="tableName">Table name</param>
        /// <returns>The populated DataTable</returns>
        public static DataTable ToDataTable(IEnumerable<TModel> source, string tableName = "TempTable")
        {
            // Create table objects 
            var table = new DataTable(tableName);
            // Set Columns 
            foreach (var dataColumn in Columns)
            {
                table.Columns.Add(new DataColumn(dataColumn.ColumnName, dataColumn.DataType));
            }

            // Loop through each row of data 
            foreach (var item in source)
            {
                table.Rows.Add(ToRowData.Invoke(item));
            }

            // Return table object 
            return table;
        }
    }

III. Testing the encapsulated code

1. Test code

Create table code

CREATE TABLE [dbo].[Person](
	[Id] [BIGINT] NOT NULL,
	[Name] [VARCHAR](64) NOT NULL,
	[Age] [INT] NOT NULL,
	[CreateTime] [DATETIME] NULL,
	[Sex] [INT] NOT NULL,
PRIMARY KEY CLUSTERED 
(
	[Id] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

Entity class code

  • The entity's property names and types must correspond to the SQL Server table's column names and types
public class Person
{
    public long Id { get; set; }
    public string Name { get; set; }
    public int Age { get; set; }
    public DateTime? CreateTime { get; set; }
    public Gender Sex { get; set; }
}

public enum Gender
{
    Man = 0,
    Woman = 1
}

The test method

// Generate 100,000 rows of test data
var persons = new Person[100000];
var random = new Random();
for (int i = 0; i < persons.Length; i++)
{
    persons[i] = new Person
    {
        Id = i + 1,
        Name = "ZhangSan" + i,
        Age = random.Next(1, 128),
        Sex = (Gender)random.Next(2),
        CreateTime = random.Next(2) == 0 ? null : (DateTime?) DateTime.Now.AddSeconds(i)
    };
}

// Create database connection 
using (var conn = new SqlConnection("Server=.;Database=DemoDataBase;User ID=sa;Password=8888;"))
{
    conn.Open();
    var sw = Stopwatch.StartNew();
    // Bulk insert data 
    var qty = conn.BulkCopy(persons);
    sw.Stop();
    Console.WriteLine(sw.Elapsed.TotalMilliseconds + "ms");
}

Result of the batch insert

226.4767ms
Press any key to continue . . .


IV. Code download

GitHub address: https://github.com/liu-zhen-liang/PackagingComponentsSet/tree/main/SqlBulkCopyComponents

Copyright notice
This article was created by [NewAI]. Please include a link to the original when reposting. Thank you.
https://chowdera.com/2020/12/20201208085115161w.html