
SQL Server Performance Forum – Threads Archive

How to create an XML file from a trigger?

This is a 3rd-party requirement, so don't ask why … We can query data from an existing XML file with OPENXML in the FROM clause, but we can't write to it. We can generate an XML-formatted string with the FOR XML clause, but there is no option to write that string to a file. So how can we save the output of a FOR XML query as a file on disk? The only way seems to be EXEC master.dbo.xp_cmdshell 'bcp …' … and run for our lives. Any other suggestions?
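Something like the following is what I mean by the bcp route – the server, database, and table names are purely for illustration:

DECLARE @cmd varchar(1000);
-- bcp runs the FOR XML query and streams the single-column result to disk
-- (-c = character mode, -T = trusted connection; adjust to taste)
SET @cmd = 'bcp "SELECT OrderID, CustomerID, OrderDate FROM MyDb.dbo.Orders FOR XML AUTO" '
         + 'queryout "C:\feeds\orders.xml" -c -T -S MYSERVER';
EXEC master.dbo.xp_cmdshell @cmd;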
No other option that I can think of… unless you could write a DCOM component which does the query and the writing, and call that from within the trigger?

It does beg the question though… why? ;-)

Cheers
Twan
Hi Twan, Thanks for confirming our worst fears. You really want to know? Cheers,
Adriaan
Hi ya,

Yeah, everyone needs a good laugh sometimes ;-)

Cheers
Twan
Well, to be brutally honest, our not writing a DCOM component comes down to the time available for the project and a lack of in-house experience with that kind of development work.

The requirement is to feed information from our database, installed at a client site, into a document management system installed by a 3rd party. The feed must be triggered by data entry on specific columns in specific tables, it should occur immediately when the data is committed, and the selected mode of transport is an XML document. The pros of XML here are that both sides can always blame the other party, and neither side has to know anything about the other's software. The cons are, of course, that SQL 2000 doesn't really support generating XML files from T-SQL (for several good reasons even I can think of), and that the whole formatting business could wreak havoc with response times.

We're thinking of using a staging table to which we feed the row identifiers, and then either (1) have an insert trigger on the staging table handle generating the XML (but we suspect the data entry table will still wait for the trigger on the staging table to finish before committing its own transaction), or (2) have a job scheduled every X seconds to check the staging table, generate the XML through a stored procedure, and clean up after itself – that would cause a slight delay, but perhaps one that's necessary to avoid an increase in response time.
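Roughly what we have in mind for the staging side – the table, column, and trigger names are just for illustration:

-- staging table that only records which rows need to be exported
CREATE TABLE dbo.XmlFeedStaging (
    RowID    int      NOT NULL,
    QueuedAt datetime NOT NULL DEFAULT (GETDATE())
);
GO

-- trigger on the monitored data-entry table; it queues identifiers only,
-- so the expensive FOR XML work can happen elsewhere
CREATE TRIGGER trg_Orders_QueueXmlFeed ON dbo.Orders
AFTER INSERT, UPDATE
AS
BEGIN
    INSERT INTO dbo.XmlFeedStaging (RowID)
    SELECT i.OrderID FROM inserted i;
END
GO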
Alrighty, the beauty of integration hey ;-)

In terms of option 1: yes, if the insert on the data entry table causes the insert into the staging table, it will wait for the trigger on the staging table before committing.

So perhaps a scheduled job may be preferable to avoid data-entry problems (it also makes it easy to redo the XML generation process).

Cheers
Twan
Twan, I thought of using a table hint when inserting into the staging table, to avoid the problem with option 1. But we can't use WITH (NOLOCK) on the target table of an insert query – only FASTFIRSTROW, HOLDLOCK, PAGLOCK, READCOMMITTED, REPEATABLEREAD, ROWLOCK, SERIALIZABLE, TABLOCK, TABLOCKX or UPDLOCK. Would it make any difference if we used any of those hints? Would an INSTEAD OF trigger on the staging table avoid the delay on the data entry table? An issue with a scheduled job could be the timing – we're expecting some data entry every few minutes, possibly in quick succession or more spread out, but nothing like high volume.
Hi ya,

No, the table hints won't help – all triggers process synchronously with the original DML statement, so an INSTEAD OF trigger won't avoid the wait either. Yep, a job will cause a slight lag, but then you will already have a lag on reading the XML file into the target system, won't you? Especially if the target system is down or inaccessible for any reason?

Cheers
Twan
The target system is not ours to worry about – we dump some XML files, they crunch them. The delay would only be a minor concern, if it lets us avoid a job firing every second. Just browsing the job wizard in EM – sometimes you do learn new stuff in EM! – and confirmed through BOL: in sp_add_jobstep you can set a file name where the results of the job step will be stored. So if the job step calls an SP that runs a FOR XML type of query, then I guess we can bypass the xp_cmdshell and bcp hassle … I suppose we have to clean up after ourselves with sp_delete_jobstep – but how do we know the step index? Perhaps have another job scheduled to clean up the older job steps? Or do one-time job steps get dropped automatically after execution? So I guess we have to decide between the hassle of bcp and cmdshell, and the overhead involved with job steps.
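For the record, this is roughly what I'm picturing – the job, procedure, and file names are made up:

-- create a job whose T-SQL step writes its result set to a file on disk
EXEC msdb.dbo.sp_add_job @job_name = N'XmlFeedExport';

EXEC msdb.dbo.sp_add_jobstep
    @job_name         = N'XmlFeedExport',
    @step_name        = N'DumpPendingRows',
    @subsystem        = N'TSQL',
    @database_name    = N'MyDb',
    @command          = N'EXEC dbo.usp_ExportPendingXml',
    @output_file_name = N'C:\feeds\orders.xml';

-- the job must be targeted at a server before it can run
EXEC msdb.dbo.sp_add_jobserver @job_name = N'XmlFeedExport';

EXEC msdb.dbo.sp_start_job @job_name = N'XmlFeedExport';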
Okay, we can use the @step_name parameter of sp_add_jobstep, and query the stepid based on the step name. Still open to suggestions!
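Something along these lines, presumably – the job and step names are again just illustrative:

-- look up the step id for a named step, then drop that step
DECLARE @job_id uniqueidentifier, @step_id int;

SELECT @job_id = j.job_id, @step_id = s.step_id
FROM msdb.dbo.sysjobs j
JOIN msdb.dbo.sysjobsteps s ON s.job_id = j.job_id
WHERE j.name = N'XmlFeedExport'
  AND s.step_name = N'DumpPendingRows';

EXEC msdb.dbo.sp_delete_jobstep @job_id = @job_id, @step_id = @step_id;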
Why not just have a job which checks the staging table every 15 secs? Yes, it causes a slight lag, but it avoids having to get more complex with adding/removing job steps. Cheers
Twan
Of course that is the easiest setup. I just noticed that we can dump the results of any job step into a file on disk – which is exactly the functionality I need, and it looked as if we could make it behave much like a trigger. But the whole job-scheduling business probably causes too many problems in itself – and perhaps one shouldn't create and drop potentially thousands of jobs each day, right? (Apologies if I start to sound like a cabinet minister.)
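If we go the simple polling route, the procedure the scheduled job calls could look something like this – all object names are illustrative:

-- export whatever is queued, then remove exactly those rows;
-- snapshotting the ids first avoids losing rows queued mid-run
CREATE PROCEDURE dbo.usp_ExportPendingXml
AS
BEGIN
    -- NOCOUNT keeps "(n rows affected)" lines out of the job's output file
    SET NOCOUNT ON;

    SELECT DISTINCT RowID INTO #Batch FROM dbo.XmlFeedStaging;

    -- the job step's output file captures this result set
    SELECT o.OrderID, o.CustomerID, o.OrderDate
    FROM dbo.Orders o
    JOIN #Batch b ON b.RowID = o.OrderID
    FOR XML AUTO;

    DELETE s
    FROM dbo.XmlFeedStaging s
    JOIN #Batch b ON b.RowID = s.RowID;
END
GO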