It all depends on the performance requirements and the general practices you use. Rune's answer can be perfectly fine. If you are inserting 100,000 rows, look at a bulk inserter.

If you are used to writing stored procs and you are lucky enough to be running SQL 2008, you can make use of [table valued params](http://www.stephenforte.net/PermaLink,guid,07dfeb00-d0b0-47dd-9761-3b4c9f160277.aspx).

This allows you to do stuff like this:

```csharp
SqlCommand cmd = new SqlCommand("usp_ins_Portfolio", conn);
cmd.CommandType = CommandType.StoredProcedure;

// add the DataTable here as a TVP
SqlParameter sp = cmd.Parameters.AddWithValue("@Portfolio", ds.Tables[0]);

// notice Structured: it marks the parameter as a table-valued parameter
sp.SqlDbType = SqlDbType.Structured;

cmd.ExecuteNonQuery();
```

Then a single call to a stored proc can insert all the rows required into the Tag table.
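If it helps, here is a rough sketch of what the database side of that call could look like. Only `usp_ins_Portfolio`, `@Portfolio` and the `Tag` table come from the snippet above; the table type name and the column names are made up for illustration, so adjust them to your real schema.

```sql
-- Hypothetical table type for the @Portfolio TVP (name and columns are assumptions).
CREATE TYPE dbo.TagTableType AS TABLE
(
    TagName nvarchar(100) NOT NULL
);
GO

-- Sketch of the stored proc the C# snippet calls; TVP parameters must be READONLY.
CREATE PROCEDURE dbo.usp_ins_Portfolio
    @Portfolio dbo.TagTableType READONLY
AS
BEGIN
    SET NOCOUNT ON;

    -- One set-based INSERT handles every row the client sent, in a single round trip.
    INSERT INTO dbo.Tag (TagName)
    SELECT TagName
    FROM @Portfolio;
END
GO
```

On the client side, the DataTable's columns need to line up with the table type's columns (by position and type) for the `Structured` parameter to bind correctly.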
 

For SQL 2005 and below I will usually use a single comma-separated param for all the values and split it in TSQL inside a stored proc. This tends to perform quite well and avoids mucking around with temp tables. It is also secure, but you have to ensure you use a text input param for the proc, or have some sort of limit or batching mechanism in code (so you do not truncate long lists).

For ideas on how to split up lists in TSQL, have a look at Erland's [excellent article](http://www.sommarskog.se/arrays-in-sql-2005.html).

The SQL 2000 version of the [article is here](http://www.sommarskog.se/arrays-in-sql-2000.html).
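As a minimal sketch of that comma-separated approach (not the specific splitter from Erland's article, and with made-up proc and parameter names), a simple loop-based version for SQL 2005 might look like this:

```sql
-- Illustrative only: a basic CHARINDEX loop splitter.
-- Erland's article covers faster, set-based variants (numbers tables, etc.).
CREATE PROCEDURE dbo.usp_ins_Tags
    @TagList nvarchar(max)   -- e.g. N'red,green,blue'
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @pos int, @next int, @item nvarchar(100);
    SET @pos = 1;

    WHILE @pos <= LEN(@TagList)
    BEGIN
        -- find the next delimiter; treat end-of-string as the final delimiter
        SET @next = CHARINDEX(',', @TagList, @pos);
        IF @next = 0 SET @next = LEN(@TagList) + 1;

        SET @item = LTRIM(RTRIM(SUBSTRING(@TagList, @pos, @next - @pos)));

        IF @item <> ''
            INSERT INTO dbo.Tag (TagName) VALUES (@item);

        SET @pos = @next + 1;
    END
END
GO
```

Note the `nvarchar(max)` parameter: that is the "text input param" point above, so a long list is not silently truncated. A set-based splitter scales better than this loop; it is shown only to illustrate the shape of the technique.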

 