Saturday, February 25, 2012
Best design for a service that will monitor db
We have an existing database that is constantly receiving updated
information on the status and attributes of specific objects within the
application in batches. As these records come in, there are portions of the
table that they populate that are intentionally left empty, because the data
for these fields is retrieved from a separate Java application through a
published web service (on the same network). We are constructing a .NET service
which will handle the retrieval of records from the Java app and push the
new data into the relevant fields.
We are currently designing the .NET service to check the database on a
predefined interval, to see if any new records have appeared that need to be
looked up in the Java application. However, it would be preferable (at
least for testing) if this interaction could be designed so that the SQL
Server 2000 database could notify the .NET service that a new batch of
records has arrived (push instead of pull). Does anyone have any knowledge
if there is a means by which this can be accomplished?
Thanks.
Hmmm ... There surely is a notification service in SQL Server, but as far as
I've read the documentation it may not be suitable for this problem ...
But I think you can write a trigger that can call a DTS package or a job
framework to do this notification ... I think this can also be one solution
...
--
HTH,
Vinod Kumar
MCSE, DBA, MCAD
http://www.extremeexperts.com
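As a rough illustration, the interval-based pull design the question describes (poll on a schedule, fill in the empty fields from the Java web service) can be sketched as follows. This is a minimal sketch in Python with the database and web-service calls stubbed out; all table, field, and function names here are invented for the example, not taken from the original system.

```python
import time

def fetch_pending_records(db):
    """Return records whose web-service fields are still empty (stubbed query)."""
    return [r for r in db if r.get("extra_data") is None]

def lookup_in_java_app(record):
    """Stand-in for the call to the Java app's published web service."""
    return f"looked-up:{record['id']}"

def poll_once(db):
    """One polling pass: find incomplete records and fill in their fields."""
    updated = 0
    for record in fetch_pending_records(db):
        record["extra_data"] = lookup_in_java_app(record)
        updated += 1
    return updated

def poll_forever(db, interval_seconds=30):
    """The service loop: wake on a predefined interval and do one pass."""
    while True:
        poll_once(db)
        time.sleep(interval_seconds)
```

A trigger-plus-job approach, as suggested above, would keep `poll_once` as-is and simply replace the `poll_forever` loop with a one-shot invocation started from the database side.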
Friday, February 24, 2012
best approach for "trigger on commit"?
We have multiple tables that get updated as part of a transaction.
After all the data is added successfully there is processing that needs
to be done, so we'd like to use a trigger to kick it off automatically. The schema can be thought of as including a "master report" table that gets updated once whenever a new report is submitted and multiple related tables each corresponding to different report sections. The PK-FK relations imply that the master report will be updated prior to the subsidiary sections, but the exact order of updates to the subtables is not defined.
I thought of attaching a trigger to the master report table but I can't find an approach or mechanism that would allow us to defer the triggered code until after all the tables have been updated.
It almost seems like I want a transaction trigger. Something that would be executed before or after a COMMIT TRAN. But the
post-processing code does not have to be deferred until after the
transaction is committed, it just mustn't be started until all the
tables participating in the transaction have been updated, otherwise
all the data needed won't be present.
There are several work-arounds that come to mind, but none are as
elegant as having a TRIGGER ON COMMIT capability. I'd be interested in
how other people have solved this problem.
Here are some of the workarounds we came up with:
1. Identify the last table to be updated, and place a trigger on that. Ugly for obvious reasons and very prone to breaking.
2. Create an "update history" table with a trigger on it. After the
actual transaction commits, a row should be added to this table by the
db user. Less ugly, but still not pretty.
3. Variation on #2 -- after an update transaction, have the database user call a stored procedure. Just as ugly as #2
4. Create a view that is used to write through to the underlying tables, and put the trigger on the view. Unfortunately the large amount of data (approx. 20 tables participate in one update transaction) makes this very unattractive. (Imagine an INSERT statement with 300 values!)
Thanks in advance for suggestions and help
Terry
Personally, I would recommend a solution that does not use triggers. Triggers are evil. They are a maintenance nightmare. After you finish your "transaction", you can call a stored proc that takes care of the rest of the updates, which is a more controlled way of doing it than doing it in a trigger. Do a simple Google search on issues with triggers and you will be convinced enough not to use them.|||
Well, I cannot agree with your feelings on triggers (cursors are evil, not triggers; triggers are just a bit difficult :). I do agree that this is not the place for a big trigger (considering how SQL Server triggers work.)
An alternative that I might suggest, if this can be done asynchronously, is to have a job that queries your tables to see when the transaction is finished: either you check for the existence of data in all tables, or you write some sort of control row (possibly using triggers here) when you do each insert into the other tables. Then the process can check to see when all of the data for a "transaction" is completed and do your processing.
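The first variant above (checking for the existence of data in all participating tables for a given report) might look roughly like this sketch. The table names, report key, and `has_row` helper are all hypothetical; in the real system this would be a scheduled job running queries against the roughly 20 section tables.

```python
SECTION_TABLES = ["report_header", "section_a", "section_b"]  # hypothetical names

def has_row(db, table, report_id):
    """Stand-in for: SELECT 1 FROM <table> WHERE report_id = ? (stubbed)."""
    return report_id in db.get(table, set())

def report_is_complete(db, report_id):
    """True once every participating table has a row for this report."""
    return all(has_row(db, t, report_id) for t in SECTION_TABLES)

def run_check_job(db, pending_reports, post_process):
    """One run of the scheduled job: post-process any reports now complete."""
    for report_id in list(pending_reports):
        if report_is_complete(db, report_id):
            post_process(report_id)
            pending_reports.remove(report_id)
```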
|||I can second Louis; triggers always had and will have a bad reputation because of wrong implementations by users. Sure, in some cases they behave slowly, they act synchronously (which can be a real nightmare if not considered) and are fired per statement, not per row, so a bit of implementation effort is needed here, but triggers can help you through the day for enforcing business rules which can't be accomplished through any middle tier (if the database is directly accessed by several applications). So I would say that they are as evil as you force them to be :-)
HTH, Jens K. Suessmeyer.
http://www.sqlserver2005.de|||After the database is developed it may have data added by an unknown number of subcontractors. So it is important that the "api" be as simple as possible and that the database be as intelligent as possible. Building business logic into the database in the form of SP and triggers has clear advantages for us over an approach that depends on the goodwill of users to follow our suggested/required calling structure.
In other words, having a document that says "after inserting all of a report call SP after_insert_processing()" opens up the door for someone to forget to call the SP. Having a mechanism in place that detects the operation and automatically calls the SP is much more attractive to us.
Hmmm.....|||BTW thanks for all of the suggestions. We may go with a job that checks the time of the last update and determines from that if the post-insert processing should be done.|||
You can do something like below:
1. Create a tracking table
2. Insert a row into the tracking table for each operation on the other table(s) (need not be at insert level; you can determine the granularity)
3. Have the insert trigger on the tracking table determine the matching conditions to fire the SP. This can be done by checking for count of specific number of operations per report for example. The post_insert_processing SP can be fired using an on-demand SQLAgent job so that it is asynchronous.
You could do away with the tracking table approach and simply do the check in every SP that inserts into the other table(s) also. Either way you can centralize the logic to check in a SP (by using simple queries against the required table(s)).
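The tracking-table idea above can be simulated as follows: count the tracked operations per report, and fire the post-processing once the expected number of inserts has arrived. The expected count and all names here are assumptions for illustration; in SQL Server this check would live in the tracking table's insert trigger, with the post-processing SP started asynchronously via an on-demand SQL Agent job.

```python
EXPECTED_OPERATIONS = 3  # assumed number of tracked inserts per report

class TrackingTable:
    """Simulates a tracking table whose insert trigger fires post-processing."""

    def __init__(self, post_process):
        self.counts = {}              # report_id -> tracked operations so far
        self.post_process = post_process

    def insert(self, report_id):
        """One tracked operation; the 'trigger' checks the matching condition."""
        self.counts[report_id] = self.counts.get(report_id, 0) + 1
        if self.counts[report_id] == EXPECTED_OPERATIONS:
            # Condition met: kick off post-processing (an async job in real life).
            self.post_process(report_id)
```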
|||My first question would be..."How do you currently ensure all information is updated correctly?"
My second question would be..."Is this a manual process that you want to include in your automation?"
It sounds as if there is some form of linear progression with a definitive end but you want to avoid this assumption for fear of change.
So...to satisfy your anxieties, create an independent action (the approach is insignificant).
Forget about the clever trickery or sly workaround, code a 101 solution to a 101 problem.
Adamus
Besides Replication, what other alternatives to sync data?
Every day, I will have to sync A & B's new/updated data to C, and C's to A & B. I know I can use Replication, but I can't here. You see, A & B are located on our own branch server while C is hosted on a third-party server. The web server provider for C will not allow me to do any Replication on C.
What other alternatives that I can use to synchronise all 3?
DTS? XML transfer? It looks as if I have no choice but to write my own sync scripts? :((
Anyone has encountered similar situations? Any recommended SQL Tool programs?
Please help. I'm at a loss as to what I'm going to do.
You can use Log Shipping, but it is available only in the Enterprise edition and the Developer edition (which is the Enterprise edition for five users only). Run a search for configuration info in the BOL (Books Online). Hope this helps.
Kind regards,
Gift Peddie
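If log shipping is also ruled out, a hand-rolled sync script is the usual fallback: compare last-modified markers on each side and push only the rows that are new or newer. A minimal sketch of that delta computation, with rows modeled as dicts keyed by primary key (all names invented for illustration):

```python
def rows_to_push(source, dest):
    """Return source rows that are new or newer than dest's copy.

    source/dest: dict mapping primary key -> {"updated": int, ...row data}.
    """
    delta = {}
    for pk, row in source.items():
        other = dest.get(pk)
        if other is None or row["updated"] > other["updated"]:
            delta[pk] = row
    return delta

def apply_delta(dest, delta):
    """Apply the pushed rows on the receiving side (last-writer-wins)."""
    dest.update(delta)
```

Run it in both directions (A/B to C, then C to A/B) on the daily schedule; note that conflicts resolve as last-writer-wins here, which may or may not be acceptable.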
Thursday, February 16, 2012
Behind the scenes SQL commands
Hello, I am running SQL 2000 SP4 on a Windows 2003 Standard Edition server.
The SQL database gets changed/updated through a web interface. I was
wondering if there was any way for me to see which SQL commands were being
run on the database when changes like deleting/inserting users are being
applied from the Web frontend?
Any help is appreciated, thanks.
Robert
Sure. Create a trace via Profiler. The text of stored procedures can be
viewed in Query Analyzer via sp_helptext if needed.
HTH
Jerry
Behind the Scenes SQL commands
Hello, I am running SQL 2000 SP4 on a Windows 2003 standard edition server.
The SQL database gets changed/updated through a web interface. I was wondering if there was any way for me to see which SQL commands are being run on the database when changes like deleting/inserting users are applied from the Web frontend?
Any help is appreciated, thanks.
Robert
Take a look in Books Online at using Profiler. Profiler is not available in MSDE, so if you are using MSDE then look for articles on using SQL Trace.