Tuesday, March 27, 2012

Best solution: SQLXML, XML Data Binding, or MSXML?

I need some help determining what the best solution would be for the following scenarios, where the dataset is large and performance is key:
Scenario #1
I am trying to retrieve relational data as XML from a SQL Server 2000 stored proc. My client is a VBA Access application. I want to map the XML elements to database tables/columns, use a stored proc to collect the data and build the XML, and validate it against the XSD. I am assuming that with large datasets it would be faster to create the XML on the server than on the client (desktop PCs), since on the client I would have to loop through data rows and build the XML with MSXML. Thoughts? And what about annotated schemas: do they have to be used with XPath queries even if the result set is already filtered in a SQL sproc?
Scenario #2
I have a .NET web service that will be the recipient of this XML; it has to validate it and persist it to SQL 2K. After looking at all the available options, it seems to me that using XML data binding (creating .NET classes based on the XSD and loading that .NET object from the XML) is the cleanest solution. Reading into SQLXML, it seems as though every method of persisting data to SQL Server uses the COM-based SQLXMLOLEDB provider, so my code would have to jump from the CLR to COM. I can't see how that is good. And MSXML is also a COM-based parser. Thoughts?
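As a rough illustration of the XSD data binding described above, here is a minimal C# sketch, assuming the Order class was generated from the schema with xsd.exe (Order, order.xsd, and order.xml are placeholder names, and the Order definition below just stands in for the generated code):

    // Sketch only: Order stands in for the class xsd.exe would generate;
    // order.xsd and order.xml are placeholder file names.
    using System;
    using System.Xml;
    using System.Xml.Schema;
    using System.Xml.Serialization;

    [XmlRoot("Order")]
    public class Order
    {
        public int OrderId;
        public string CustomerName;
    }

    public class DataBindingSketch
    {
        public static Order LoadAndValidate(string xmlPath, string xsdPath)
        {
            // Validate against the XSD while reading (.NET 1.x validation API).
            XmlValidatingReader reader =
                new XmlValidatingReader(new XmlTextReader(xmlPath));
            reader.ValidationType = ValidationType.Schema;
            reader.Schemas.Add(null, xsdPath);
            reader.ValidationEventHandler +=
                new ValidationEventHandler(OnValidationError);

            // Deserialize straight into the generated class.
            XmlSerializer serializer = new XmlSerializer(typeof(Order));
            return (Order)serializer.Deserialize(reader);
        }

        private static void OnValidationError(object sender, ValidationEventArgs e)
        {
            throw new XmlException("Schema validation failed: " + e.Message);
        }
    }

Everything here stays in managed code, so there is no COM interop hop; as the reply below points out, the trade-off against the bulk-load path is loading speed, which is worth measuring.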
Re Scenario #1: I would look at the FOR XML functionality of SQL Server 2000 if you want to do it in the database. Annotated schemas do not work with stored procs. You can look into the client-side FOR XML of SQLXML 3.0 if you want to run FOR XML over the result of a stored proc.
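For reference, a minimal sketch of that client-side FOR XML route using the SQLXML 3.0 managed classes (SqlXmlCommand); it is written in C# rather than VBA for brevity, and GetOrders, the server, and the database names are placeholders:

    // Sketch only: GetOrders and the connection details are placeholders.
    using System;
    using System.IO;
    using Microsoft.Data.SqlXml;   // SQLXML 3.0 managed classes

    public class ClientSideForXmlSketch
    {
        public static void Main()
        {
            SqlXmlCommand cmd = new SqlXmlCommand(
                "Provider=SQLOLEDB;Server=(local);Database=Northwind;" +
                "Integrated Security=SSPI");

            // Have SQLXML shape the sproc's rowset into XML on the client;
            // NESTED is the client-side counterpart of FOR XML AUTO.
            cmd.CommandType = SqlXmlCommandType.Sql;
            cmd.ClientSideXml = true;
            cmd.RootTag = "Orders";
            cmd.CommandText = "EXEC GetOrders FOR XML NESTED";

            using (Stream xml = cmd.ExecuteStream())
            using (StreamReader reader = new StreamReader(xml))
            {
                Console.WriteLine(reader.ReadToEnd());
            }
        }
    }

From the Access client the same client-side formatting should be reachable through ADO and the SQLXMLOLEDB provider, which avoids hand-building the document with MSXML loops.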
Re Scenario #2: The SQLXML Bulkload object can be called through the managed providers (in the latest SP of SQLXML 3.0). You will go through COM interop, which you do not have to do for the data binding. But I think the loading may still be faster due to the bulk-load mechanism. I would suggest doing some perf tests for your specific data...
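A minimal sketch of that bulk-load path through COM interop, assuming a reference to the SQLXML Bulkload type library (the SQLXMLBULKLOADLib interop namespace and SQLXMLBulkLoad3Class name depend on the SQLXML version you reference; file names and the connection string are placeholders):

    // Sketch only: assumes a COM reference to the SQLXML BulkLoad type library,
    // imported here as the SQLXMLBULKLOADLib interop namespace.
    using System;
    using SQLXMLBULKLOADLib;

    public class BulkLoadSketch
    {
        public static void Main()
        {
            SQLXMLBulkLoad3Class bulkLoad = new SQLXMLBulkLoad3Class();
            try
            {
                bulkLoad.ConnectionString =
                    "Provider=SQLOLEDB;Server=(local);Database=Northwind;" +
                    "Integrated Security=SSPI";
                bulkLoad.ErrorLogFile = @"c:\temp\bulkload-errors.xml";
                bulkLoad.KeepIdentity = false;

                // The annotated mapping schema ties XML elements to tables/columns;
                // Execute streams the document into SQL Server in bulk.
                bulkLoad.Execute(@"c:\temp\order-mapping.xsd", @"c:\temp\orders.xml");
            }
            finally
            {
                // Release the underlying COM object promptly.
                System.Runtime.InteropServices.Marshal.ReleaseComObject(bulkLoad);
            }
        }
    }

Even with the interop hop, streaming a large document through Bulkload against an annotated mapping schema can outpace deserializing into objects and inserting row by row, which is why measuring both against your own data is the safest call.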
Best regards
Michael
"Aazihh" <nowayjose@.newsgroupms.com> wrote in message
news:D56903DB-3984-4EE4-9CE1-5E1CCC9FBEA6@.microsoft.com...
>I need some help to determine what the best solution would be for the
> following scenarios where the dataset is large and performance is key:
> Scenario #1
> I am trying to retrieve relational data in XML from a SQL Server 2000
> stored
> proc. My client is a VBA Access application. I want to be able to map the
> xml
> elements to db tables/columns and use a stored proc to collect the data
> and
> build the xml. Validate it against the xsd. I am assuming that with large
> datasets it would be faster to create the xml on the server than on the
> client - desktop pcs - since I would have to loop through data rows to
> create the xml using MSXML on the client, thoughts. What about annotated
> schemas. Do you have to use along with XPath queries even if your result
> is
> already filtered in an SQL sproc.
> Scenario #2
> I have a .Net webservice that will be the recepient of this xml and
> validate
> it and persisit it to SQL 2K. After looking at all available options it
> seems
> to me that using XML Data Binding & creating .Net classes based on the xsd
> &
> loading this .Net object from xml is the cleanest solution. Reading into
> SQLXML, it seems as though every method to persist data to SQL Server uses
> the COM-based SQLXMLOLEDB provider and my code will have to jump from CLR
> to
> COM. Cant see how that is good. And MSXML is also COM based parser.
> Thoughts?
