
  1. SQL Server and CLR, batching SqlFunction
<p>I have a CLR function that returns "n" rows with random data. For example, to prime an empty table with 100 rows of test data I could write:</p>
<pre><code>INSERT INTO CustomerInfo(FirstName, LastName, City...)
SELECT FirstName, LastName, City... FROM MyCLRFunction(100)
</code></pre>
<p>This returns 100 "customers" with random information. If I were to call it with a very high number I would get an out-of-memory error, since the entire dataset is created before it is sent to the caller. I can, of course, use the SqlPipe object and send rows as they are created, but as far as I can tell that approach only works with SqlProcedures. That would rule out the INSERT INTO approach, since you can't SELECT from a stored procedure.</p>
<p>I'm hoping that I've just missed something here and that it is actually possible to combine SqlPipe.SendResultsRow with a function, or that someone has a clever workaround.</p>
<p>I <em>could</em> leave it as a proc and have that proc put the records into a session-scoped temporary table. The caller could then use that table in their SELECT clause, but I'm hoping for the best of both worlds, where I can offer the caller a clean syntax and still scale to a large number of records.</p>
<p>Frankly, the original solution is probably "good enough", since we will probably never want this much test data, and even if we did we could run the INSERT statement multiple times. But I'm trying to get a full understanding of CLR integration and wondering how I would address this if a similar use case presented itself in a business scenario.</p>
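<p>One detail worth noting: a CLR table-valued function declared with <code>FillRowMethodName</code> returns an <code>IEnumerable</code>, and SQL Server pulls rows from that enumerator lazily. If the function body uses <code>yield return</code> instead of materializing a collection, rows are produced on demand and memory stays flat even for large row counts. A minimal sketch of that pattern (the names <code>GetRandomCustomers</code> and <code>FillCustomerRow</code> are illustrative, not from the original question):</p>
<pre><code>using System;
using System.Collections;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public partial class UserDefinedFunctions
{
    // Table-valued CLR function: SQL Server consumes the returned
    // IEnumerable one element at a time, so no full buffering occurs.
    [SqlFunction(FillRowMethodName = "FillCustomerRow",
                 TableDefinition = "FirstName nvarchar(50), " +
                                   "LastName nvarchar(50), City nvarchar(50)")]
    public static IEnumerable GetRandomCustomers(SqlInt32 count)
    {
        var rng = new Random();
        for (int i = 0; i &lt; count.Value; i++)
        {
            // yield return produces each row on demand rather than
            // building the whole result set in memory first.
            yield return new string[]
            {
                "First" + rng.Next(1000),
                "Last"  + rng.Next(1000),
                "City"  + rng.Next(1000)
            };
        }
    }

    // Invoked once per yielded object to split it into output columns.
    public static void FillCustomerRow(object row,
        out SqlString firstName, out SqlString lastName, out SqlString city)
    {
        var cols = (string[])row;
        firstName = cols[0];
        lastName  = cols[1];
        city      = cols[2];
    }
}
</code></pre>
<p>With this shape, the original <code>INSERT INTO ... SELECT ... FROM MyCLRFunction(n)</code> syntax still works unchanged, and no SqlPipe or temp-table workaround is needed; the streaming happens inside the enumerator.</p>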