Tuesday, March 27, 2012
Best techniques for database creation?
Does anyone know of a link covering best practices for schema and database creation in SQL Server 2000, in detail?
Faheem Latif
NETWORK SOLUTION
Faheem,
What specifically are you looking for? There are many things on the web if
you search Google. However, I recommend these links as a start:
SQL Server 2000 Operations Guide
http://www.microsoft.com/technet/prodtechnol/sql/2000/maintain/sqlops0.mspx
Vyas's article on Best Practices
http://www.sql-server-performance.com/vk_sql_best_practices.asp
Microsoft SQL Server 2000 Best Practices Analyzer 1.0 Beta
http://www.microsoft.com/downloads/details.aspx?FamilyId=B352EB1F-D3CA-44EE-893E-9E07339C1F22&displaylang=en
Tinyurl: http://tinyurl.com/upzi
Mark Allison, SQL Server MVP
http://www.markallison.co.uk
"Faheem" <anonymous@.discussions.microsoft.com> wrote in message
news:189C9EC5-9492-4B34-A0D7-C9908D15BA0E@.microsoft.com...
> Dear Experts,
> Any one knows any link regarding best practices for schema and database
creation in SQL server 2000 in detail .
> Faheem Latif
> NETWORK SOLUTION
>
Sunday, March 25, 2012
Best practice regarding tables?
Hi!
I have 6-7 tables in total containing subobjects for different objects, like phone numbers and emails for contacts.
This means I have to run several queries on each detail page. I will use stored procs for fetching the subobjects.
My question, therefore: if I merged subobjects into the same tables, letting me use perhaps 2 queries instead of 4, but thus perhaps doubling the size of those tables, could this make any performance difference whatsoever?
As I see it, the pros are fewer queries, and the cons are larger tables, plus I will need another field separating the types of objects in the table.
Anyone have insight to this?
I would be curious to see what you mean by "subobjects" - usually when you ask a question like this it is a good idea to post your table structure. You'll get more specific help that way.
In general, you should strive for proper normalization of your database. This normally means more tables, and the tables are thin, not wide. However, if you have similar sets of data that can be "typed" and placed into the same table, that is normally a good route to take. For example, if you have business phone numbers and home phone numbers, I would recommend putting these in the same table and adding a field to specify each number's type, rather than having two separate tables. On the other hand, you shouldn't try to squeeze more disparate types of data into the same table and type them.
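The "typed" approach described here might look something like this minimal sketch (the table and column names are invented for illustration, not taken from any posted schema):

```sql
-- One table holds both home and business numbers; the PhoneType
-- column replaces what would otherwise be two near-identical tables.
CREATE TABLE dbo.ContactPhone (
    PhoneId     int IDENTITY (1, 1) NOT NULL PRIMARY KEY,
    ContactId   int NOT NULL,          -- points at the contact row
    PhoneType   char(1) NOT NULL       -- 'H' = home, 'B' = business
                CHECK (PhoneType IN ('H', 'B')),
    PhoneNumber varchar(30) NOT NULL
)
```

A query for one kind of number then just filters on the type column, e.g. `WHERE ContactId = @id AND PhoneType = 'B'`.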
|||By objects I mean things like applications, users, and groups.
By subobjects I mean phone numbers, emails, contacts (for example, when an application has a number of contacts linked to it), etc.
To illustrate i post some of my tables:
First an object table:
CREATE TABLE [sw20aut].[sw_apps] (
[id] [int] IDENTITY (1, 1) NOT NULL ,
[kund_id] [int] NOT NULL ,
[namn] [varchar] (80) COLLATE Finnish_Swedish_CI_AS NOT NULL ,
[beskrivning] [text] COLLATE Finnish_Swedish_CI_AS NOT NULL DEFAULT ('') ,
[usr_rubrik] [varchar] (60) COLLATE Finnish_Swedish_CI_AS NOT NULL DEFAULT ('Övrig information') ,
[usr_information] [text] COLLATE Finnish_Swedish_CI_AS NOT NULL DEFAULT ('') ,
[kat] [int] NOT NULL DEFAULT('0') ,
[typ] [tinyint] NOT NULL DEFAULT('0') ,
[skapare] [varchar] (15) COLLATE Finnish_Swedish_CI_AS NOT NULL ,
[skapad] [datetime] NOT NULL ,
[del] [bit] NOT NULL DEFAULT ('0')
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
Here are some subtables:
CREATE TABLE [sw20aut].[sw_sub_email] (
[id] [int] IDENTITY (1, 1) NOT NULL ,
[p_typ] [varchar] (3) COLLATE Finnish_Swedish_CI_AS NOT NULL ,
[p_id] [int] NOT NULL ,
[epost] [varchar] (60) COLLATE Finnish_Swedish_CI_AS NULL ,
[kommentar] [varchar] (60) COLLATE Finnish_Swedish_CI_AS NOT NULL
) ON [PRIMARY]
CREATE TABLE [sw20aut].[sw_sub_phone] (
[id] [int] IDENTITY (1, 1) NOT NULL ,
[p_typ] [varchar] (3) COLLATE Finnish_Swedish_CI_AS NOT NULL ,
[p_id] [int] NOT NULL ,
[tel_land] [varchar] (6) COLLATE Finnish_Swedish_CI_AS NULL ,
[tel_rikt] [varchar] (6) COLLATE Finnish_Swedish_CI_AS NULL ,
[tel_nr] [varchar] (30) COLLATE Finnish_Swedish_CI_AS NULL ,
[kommentar] [varchar] (60) COLLATE Finnish_Swedish_CI_AS NOT NULL
) ON [PRIMARY]
CREATE TABLE [sw20aut].[sw_sub_contacts] (
[id] [int] IDENTITY (1, 1) NOT NULL ,
[p_typ] [varchar] (3) COLLATE Finnish_Swedish_CI_AS NOT NULL ,
[p_id] [int] NOT NULL ,
[kon_id] [int] NOT NULL ,
[kon_kid] [int] NOT NULL ,
[kommentar] [varchar] (60) COLLATE Finnish_Swedish_CI_AS NOT NULL
) ON [PRIMARY]
CREATE TABLE [sw20aut].[sw_sub_grupper] (
[id] [int] IDENTITY (1, 1) NOT NULL ,
[p_typ] [varchar] (3) COLLATE Finnish_Swedish_CI_AS NOT NULL ,
[p_id] [int] NOT NULL ,
[grp_id] [int] NOT NULL ,
[grp_kid] [int] NOT NULL ,
[kommentar] [varchar] (60) COLLATE Finnish_Swedish_CI_AS NOT NULL
) ON [PRIMARY]
p_id = the parent's id
p_typ = type of parent, for example app, con or grp
As seen above, I could merge sub_contacts and sub_grupper easily (though I would need another field to flag whether a row is a contact or a group), and I could probably merge email and phone as well, in the worst case using some kind of comma-separated format to hold country code, area code, etc. I have a couple of other similar sub tables as well. The question is whether it would be worth the effort, or whether the larger tables (more rows) and the extra field to filter on would negate the advantage of fewer queries when listing. Practically it would mean going from 6-7 queries on the worst pages down to 3-4.
I use batched dynamic queries and some stored procs, depending on circumstances.
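For what it's worth, a merged version of sw_sub_contacts and sw_sub_grupper along the lines described would look roughly like this sketch (sw_sub_links and the sub_typ column are invented names, following the existing naming style):

```sql
CREATE TABLE [sw20aut].[sw_sub_links] (
    [id] [int] IDENTITY (1, 1) NOT NULL ,
    [p_typ] [varchar] (3) COLLATE Finnish_Swedish_CI_AS NOT NULL ,
    [p_id] [int] NOT NULL ,
    [sub_typ] [varchar] (3) NOT NULL ,  -- 'con' = contact row, 'grp' = group row
    [sub_id] [int] NOT NULL ,           -- was kon_id / grp_id
    [sub_kid] [int] NOT NULL ,          -- was kon_kid / grp_kid
    [kommentar] [varchar] (60) COLLATE Finnish_Swedish_CI_AS NOT NULL
) ON [PRIMARY]
```

Listing both kinds of links for a parent then becomes one query filtered on p_typ and p_id, at the cost of the extra sub_typ column and a larger table.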
The baseline for the table definition is files and associations, so if what I am seeing is correct, you have maybe one or two secondary tables, and the relationship is actually quantified by upper and lower bound cardinality, meaning the number of expected rows. And I don't think name should be varchar(80); more like 30-50. Try the links below for functional dependency and free tables you can clone. Hope this helps.
http://www.databaseanswers.org/data_models/
http://www.utexas.edu/its/windows/database/datamodeling/rm/rm7.html
|||The problem with sizing the name columns in the different tables is partly that I will have to import data from old tables used by the former application.
Those tables are very poorly designed (even by my standards ;) ) and have led users to put more than just the name into the name fields, and so forth. I will control this better programmatically, but
stripping out existing text from the field would not be popular, so my thought was to implement it this way and try to steer all new information in a better direction.
I think this meant I needed that size when examining the old data, to prevent truncation.
Anyway, the tables above are examples; I have more secondary tables and quite a few primary tables as well.
The basic concept I was wondering about is whether it is better practice to have larger tables, which results in fewer queries per page (merging the sub/secondary tables as much as practically possible), or to use smaller, slicker tables and more in-page queries/procs. For example, using 1 table to link both contacts and groups, and 1 table for perhaps phone/email and even links, as long as one can reuse the fields without ending up with empty fields.
Shall check your links btw
|||You could clean the data before importing it. I am concerned you are describing tables on the calculus end, where you have a main table and the dimensions; on this end the table sizes should be similar. I know you can use validators, but you can also use a CHECK CONSTRAINT on your columns so people cannot insert whatever they like into the name columns.
http://msdn2.microsoft.com/en-us/library/ms188258.aspx
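As a sketch of that suggestion, a CHECK constraint on one of the name columns from the earlier posts might look like this (the rule itself is only an example; adjust it to whatever the names are actually allowed to contain):

```sql
-- Disallow empty names and names containing digits, so users cannot
-- dump arbitrary extra data into the name column.
ALTER TABLE [sw20aut].[sw_apps] WITH NOCHECK
    ADD CONSTRAINT CK_sw_apps_namn
    CHECK (LEN(LTRIM(RTRIM([namn]))) > 0
           AND [namn] NOT LIKE '%[0-9]%')
```

WITH NOCHECK leaves the imported legacy rows alone and validates only new inserts and updates.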
Tuesday, March 20, 2012
Best Practice: Procedures: (Insert And Update) OR JUST (Save)
I have a Product Table.
And now I have to create its Stored Procedures.
I am asking about the best practice for the Insert and Update methods.
There are two options.
1. Create separate 2 procedures like InsertProduct and UpdateProduct.
2. Create just 1 procedure like ModifyProduct, which programmatically checks whether the record is present or not. If present, update; if not, insert. Just like Imar has done in his article: http://imar.spaanjaars.com/QuickDocId.aspx?quickdoc=419
Can anyone explain which is the better one?
Waiting for helpful replies.
There's no "best practice" for this one. Imar presumably likes his "Save" approach because whether you are adding a new record or amending an existing one, generally software applications ask you to click the Save button - so he likes to make his programming logic analogous.
Personally, I prefer the KISS principle, and create 2 separate procedures. It's clear from the interface which one to call as a result of user action. I also see the decision as to whether to Insert or Update as being a business logic decision, and I'm uncomfortable about putting business logic in a stored procedure. The reason for this is that the business logic may not be transferable to another database platform.
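To make the two options concrete, here is a minimal sketch of both, assuming a simple Product table with ProductId, Name and Price columns (all names invented for illustration):

```sql
-- Option 1: two separate procedures (the KISS approach).
CREATE PROCEDURE dbo.InsertProduct
    @Name varchar(50), @Price money
AS
    INSERT INTO dbo.Product (Name, Price) VALUES (@Name, @Price)
GO

CREATE PROCEDURE dbo.UpdateProduct
    @ProductId int, @Name varchar(50), @Price money
AS
    UPDATE dbo.Product
    SET Name = @Name, Price = @Price
    WHERE ProductId = @ProductId
GO

-- Option 2: one "Save" procedure that decides for itself.
CREATE PROCEDURE dbo.SaveProduct
    @ProductId int, @Name varchar(50), @Price money
AS
    IF EXISTS (SELECT * FROM dbo.Product WHERE ProductId = @ProductId)
        UPDATE dbo.Product
        SET Name = @Name, Price = @Price
        WHERE ProductId = @ProductId
    ELSE
        INSERT INTO dbo.Product (Name, Price) VALUES (@Name, @Price)
GO
```

The IF EXISTS branch in Option 2 is exactly the insert-or-update decision being debated in this thread.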
I do not understand your last point regarding Business Logic.
I understand that it should be better in your opinion to create 2 separate procedures.
But what about Business Logic Methods.
|||
zeeshanuddinkhan@.hotmail.com:
I do not understand your last point regarding Business Logic.
Well, I suppose it depends on how you define "Business Logic". And this illustrates one of the problems with layering an application. The reason why there are so many books and theories on architecture is because there is no "right" way to do it, and definitions of what belongs in which layer are different. Some things so obviously belong in certain layers, but other things might or might not - depending on what you are used to, how you think, what you are told to do by your team leader etc. There is for example, a huge debate about whether stored procedures are a bad thing altogether, because they can be viewed as placing business logic in a database and not in the BLL.
It also depends on how atomic you want your application to be (how much you like to break functionality down into discrete parts - methods, classes, procedures etc). Imar would no doubt suggest that the action of the user defines that a Save() method be called, and that while the Save() method can include two alternative actions (Insert or Update), both lead to a row being saved to the database, so it's essentially the same action. The procedure decides whether an existing row is updated or a new one created. I see Insert and Update as being too different to be combined into one method. Consequently, I break the procedures apart into separate atomic constructs. I view the difference between the 2 as a business logic thing - because I can - and something in my gut tells me it is.
That's purely my view and is neither right nor wrong. Others may not agree, and they will no doubt have valid justification for their view. It's right for me but wrong for Imar. And that's why I said at the beginning that there is no Best Practice for Insert or Update v Save. It's purely down to your personal preference. Imar's solution has a certain appeal, in that it contains a certain "cleverness". Some people like that. Nothing wrong with that at all.
Quite often the difference between two alternatives is purely philosophical, and has nothing to do with performance, maintainability or re-usability, which are the three items that Best Practice should be concerned with.
[Edit]
Just re-read my first response and, having rambled on above, I see I may have missed your point. If you were asking about transferable business logic, it may be that you have to move the application to a different database system which doesn't support stored procedures, but may support basic INSERT, UPDATE, SELECT and DELETE saved queries. In this case, it wouldn't be too difficult to copy and paste the SQL from each part of the proc, but if you make procs do too much in terms of massaging data, or deciding on a course of action, you will create a load more work in your migration.
You are also perfectly free to ignore this on the basis that "it will never happen". Only you know best.
Sunday, March 11, 2012
Best practice for dbo
Does it really have any importance at all who the owner (as defined by 'dbo') is?
I'd strongly recommend leaving sa as dbo, and, if need be, making the user a member of the db_owner role instead.
-PatP|||Thanks.
The issue was raised when I noticed that for older user databases, someone had assigned a system admin as the dbo under his own personal user name. When that person then left, and his user was removed, those user databases became orphans.|||You can assign db_ddladmin.
db_ddladmin acts much the same as dbo, but it has limited rights compared to db_owner.|||I suspect that Coolberg's problem wasn't one of permission level (they want the user to be equivalent to dbo), but one of ownership (they don't want the login to "own" the database).
There are two issues here that are tightly intertwined, and often confused.
A login is what gives a person access to SQL Server. Logins exist at the server level, and can be either SQL Authenticated or Windows Authenticated. Logins are what "own" a database.
A User is what gives a person permissions inside a SQL Server database. Users exist only inside a database, and are logically tied to exactly one login on the server.
I think that Coolberg wants to keep the ownership of the database limited to an administrative login. I strongly recommend using sa (because you just about can't delete that login), but I agree with the general idea regardless of what login you use.
By using this strategy, you can keep the database ownership limited to an administrative login, but still make any database users members of the db_owner role (giving them exactly the same permissions as dbo).
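A rough sketch of that strategy in SQL Server 2000 terms (the database and login names here are placeholders):

```sql
-- Run inside the database whose ownership you want to fix.
USE SomeUserDb

-- Give ownership back to sa...
EXEC sp_changedbowner 'sa'

-- ...then let the administrator back in as an ordinary user,
-- granting dbo-equivalent rights through the role instead.
EXEC sp_grantdbaccess 'DOMAIN\SomeAdmin'
EXEC sp_addrolemember 'db_owner', 'DOMAIN\SomeAdmin'
```

If the login being made owner already maps to a user in the database, sp_changedbowner will complain; drop that user first.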
-PatP|||Thanks.
Yes, I'll go for the sa user.
My main goal is to avoid ending up with orphaned databases when users leave in the future.
Wednesday, March 7, 2012
Best Method to update table...
I've got a little question regarding the speed of an update query...
situation:
I've got different tables containing information which I want to add to one big table through a schedule (or as fast as possible).
Bigtable size:
est. 180000 records with 25 fields (most varchar).
Currently I've tried two different methods:
delete all rows in the big table and re-add the ones from the little tables (through a UNION ALL query)
-> Speed ~ 15 seconds
refresh all changed rows (through a timestamp <> comparison) and add new titles (through a UNION ALL query)
-> Speed ~ 20 seconds
Does anybody know a faster solution? The union queries block the table for those 20 Seconds...
Thanks for any reply!
RE: situation: I've got different tables containing information which I want to add to one big table through a schedule (or as fast as possible).
Bigtable size:
est. 180000 records with 25 fields (most varchar).
Currently I've tried two different methods:
delete all rows in the big table and re-add the ones from the little tables (through a UNION ALL query)
-> Speed ~ 15 seconds
refresh all changed rows (through a timestamp <> comparison) and add new titles (through a UNION ALL query)
-> Speed ~ 20 seconds
Q1 Does anybody know a faster solution? The union queries block the table for those 20 Seconds... Thanks for any reply!
A1 Maybe.
As with many things, it depends on the requirements. For example, some possible considerations may include various permutations and combinations of any of the following (not an exhaustive list):
a using a lower isolation level for the union queries, and conditionally unioning only updated tables
b implementing triggers to update the target as DML is committed at the source tables
c a create, populate, and rename table scheme (dropping the old table)
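A rough sketch of option (c), with made-up table names (the commented UNION ALL stands in for the real list of source tables):

```sql
-- Build the replacement copy while readers keep using the old table.
SELECT *
INTO   dbo.BigTable_new
FROM   dbo.SmallTable1
-- UNION ALL
-- SELECT * FROM dbo.SmallTable2 ... (remaining source tables)

-- Swap the names; readers are blocked only for this brief moment.
BEGIN TRANSACTION
EXEC sp_rename 'dbo.BigTable', 'BigTable_old'
EXEC sp_rename 'dbo.BigTable_new', 'BigTable'
COMMIT TRANSACTION

DROP TABLE dbo.BigTable_old
```

Note that SELECT INTO does not copy indexes or permissions, so those would need to be recreated on the new table before the swap.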
Best method or is it even worth the effort?
Hi,
I have a question regarding a project I am working on and I'm not sure what the best way to do it may be, I am new to SSIS and would like to gain some exposure using it, but don't know if this project is overkill or even possible using SSIS.
*** PROBLEM ****
Basically I have a large number of flat files, and each file contains different columns and data. What I want to do is import all of the data from the flat files into corresponding tables in SQL Server. The tables in SQL Server are already defined with a structure to match each file. The first 10 characters of each flat file's name are also the same as the name of the corresponding SQL table. So it seems easy enough to just import flat file to table one by one, but I cannot figure out how to match each flat file's structure to its table without setting up a task for each table (there are roughly 50-60 tables). Another reason I want to do it with SSIS is that these flat files have to be imported on a weekly basis.
**********************
Is it in my best interest to just create a stored procedure that basically loops through the files and imports the data? Or do you think using SSIS is the best solution? I have looked at SSIS a bit and thought maybe I could use the Bulk Insert task, but it appears it cannot adapt dynamically to different column specs. I figured out how to loop through the flat files using the Foreach Loop task, but I could only get it to import into one SQL table. I also started looking at the Data Flow task and thought about creating a Script task, but I'm not sure if that would work, and I'd like to know before I spend a lot of time trying to figure it out.
Any suggestions or thoughts are appreciated.
Thank you
Are you saying that there are many record layouts/formats in the same file? Or one file has only one format, but there are multiple formats and multiple files with those formats?
Either way, if you have 50 file formats, it seems like you're going to have to configure 50 flat file source connections. There may be a good way with the Script task, but that's not my strength. But I would choose SSIS over the other methods - at the least, it sounds like you should look into the Conditional Split transformation and variables. Let us know about the above question.
|||To implement this without scripting (I'm under the impression that your 50 files / tables have different formats), you'd need to create 50 data flows.
It certainly could be implemented via a script, but I'm not sure what benefit you'd get from writing the script in SSIS versus just creating a VB or C# application.
|||Not sure what it is you are doing, but it sounds EXACTLY like what I have recently done - except with 45 files/tables. I started to go down the route of individual DFs, but instead created individual packages for each of the files I have to process. Either way will work. Basically, we have a parent package that kicks off the children packages to process each file. This is done through iterating through our known files for each organization that will submit these 45 files.
This is working as designed and we have not run into any trouble. We are still working on getting some better error logging now. Please feel free to post if you have any questions.
Regards
I was hoping to create something a little less cumbersome, although I think that will work and may be my last resort. I have something right now that seems to work; I need to do further testing and development to see if it will meet my needs.
What I have done is create a simple foreach loop and inside the foreach loop I have a script task that takes the return value from the foreach loop which is the file name, and parses the string into values that I can pass into the bulk insert task.
In my script I set the Dts.Connections.Item("BulkFile").ConnectionString to the filename variable from the foreach loop and then I parse the string to get the name of my table and set a user defined variable I just called "TableName" with the parsed string value. I then open the Bulk Insert Task and set the DestinationTableName expression to my user defined variable "TableName".
My initial findings appear to work well, but so far I have only imported six of the tables dynamically; I still have 50+ to go. I have to create some stored procedures to clean out the tables before the bulk insert, due to primary key constraints etc., but basically once a week I am flushing all of the data in these tables and replacing it with the data from the flat files.
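The cleanup step mentioned here could be as simple as this sketch (usp_ClearImportTable is a made-up name; @TableName would be the value parsed from the file name, as described above):

```sql
CREATE PROCEDURE dbo.usp_ClearImportTable
    @TableName sysname
AS
    -- DELETE rather than TRUNCATE, since TRUNCATE TABLE is refused
    -- on tables referenced by foreign key constraints.
    EXEC ('DELETE FROM ' + @TableName)
GO
```

Building SQL from a table name like this assumes the incoming file names are trusted; it is worth validating @TableName against sysobjects before executing.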
Sunday, February 19, 2012
Benefits of SQL authentication?
SQL2K
SP4
I can find lots of info regarding the justification to use WINNT instead of
SQL authentication. Are there any good reasons to use SQL authentication
instead?
TIA, ChrisR
Web applications, Mac or Linux users, or any situation where the users are
not members of the domain.
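For example, a web application running outside the domain would typically be given its own SQL-authenticated login, something like this sketch (the login, password, and database names are placeholders):

```sql
-- Create a SQL-authenticated login for the web application...
EXEC sp_addlogin 'webapp_user', 'Str0ngP@sswOrd'

-- ...and grant it access to just the one database it needs.
USE SalesDb
EXEC sp_grantdbaccess 'webapp_user'
EXEC sp_addrolemember 'db_datareader', 'webapp_user'
```

The application then connects with that login and password in its connection string, instead of a Windows account.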
Arnie Rowland, Ph.D.
Westwood Consulting, Inc
Most good judgment comes from experience.
Most experience comes from bad judgment.
- Anonymous
"ChrisR" <ChrisR@.discussions.microsoft.com> wrote in message
news:AFC675ED-FE57-41DD-8CF9-F494E112E95A@.microsoft.com...
> SQL2K
> SP4
> I can find lots of info regarding the justification to use WINNT instead
> of
> SQL authentication. Are there any good reasons to use SQL authentication
> instead?
> TIA, ChrisR|||Thanks, I should have been more specific. I know there are times when users
MUST use SQL auth. However, if it could go either way, what would the
benefits be to SQL auth?
"Arnie Rowland" wrote:
> Web applications, Mac or Linux users, or any situation where the users are
> not members of the domain.
> --
> Arnie Rowland, Ph.D.
> Westwood Consulting, Inc
> Most good judgment comes from experience.
> Most experience comes from bad judgment.
> - Anonymous
>
> "ChrisR" <ChrisR@.discussions.microsoft.com> wrote in message
> news:AFC675ED-FE57-41DD-8CF9-F494E112E95A@.microsoft.com...
>
>|||If you have a choice, the pros are for Windows authentication, and the cons
are for SQL Authentication.
I, honestly, can't recall situations where SQL Authentication is the best
choice over Windows Authentication -unless SQL Authentication is required
for some specific need.
Arnie Rowland, Ph.D.
Westwood Consulting, Inc
Most good judgment comes from experience.
Most experience comes from bad judgment.
- Anonymous
"ChrisR" <ChrisR@.discussions.microsoft.com> wrote in message
news:F115C81F-55BA-46B1-A7AE-F156BD815A95@.microsoft.com...
> Thanks, I should have been more specific. I know there are times when
> users
> MUST use SQL auth. However, if it could go either way, what would the
> benefits be to SQL auth?
>
> "Arnie Rowland" wrote:
>|||There aren't any benefits aside from needing to use it. You wouldn't
'choose' SQL Auth over Windows auth if you had both as a viable option -
ever. SQL Auth is a subset of Windows Auth - not vice versa.
"ChrisR" <ChrisR@.discussions.microsoft.com> wrote in message
news:F115C81F-55BA-46B1-A7AE-F156BD815A95@.microsoft.com...
> Thanks, I should have been more specific. I know there are times when
> users
> MUST use SQL auth. However, if it could go either way, what would the
> benefits be to SQL auth?
>
> "Arnie Rowland" wrote:
>
Benchmarks for XML EXPLICIT vs. standard recordset?
I was wondering if anyone had links and/or information regarding SQL Server
2000 performance benchmarks when it comes to using FOR XML EXPLICIT
techniques versus using a standard recordset. I've built a number of stored
procs at work using FOR XML EXPLICIT and it's been a huge time-saver. But
alas, the DBAs are unfamiliar (and thus "uncomfortable") with my use of these
techniques.
The alternative, manually building an XML document from recordsets on the VB.NET side, seems sloppy and cumbersome to me. I'm hoping I can garner some ammunition that supports FOR XML EXPLICIT.
I haven't seen any performance figures (but you might want to re-post in the
.sqlserver.xml group just to make sure). If I were you, I would run some
load tests on both the XML procedures and equivalent rowset procedures to
show whether or not the XML will cause a performance problem.
Adam Machanic
SQL Server MVP
http://www.datamanipulation.net
--
"Frefaln via SQLMonster.com" <forum@.SQLMonster.com> wrote in message
news:52BBD3DFC1848@.SQLMonster.com...
> Hello all.
> I was wondering if anyone had links and/or information regarding SQL
Server
> 2000 performance benchmarks when it comes to using FOR XML EXPLICIT
> techniques versus using a standard recordset. I've built a number of
stored
> procs at work using FOR XML EXPLICIT and it's been a huge time-saver. But
> alas, the DBAs are unfamiliar (and thus "uncomfortable") with my use of
these
> techniques.
> The alternative, manually building an XML document from recordsets on the
VB.
> NET side, seems sloppy and cumbersome to me. I'm hoping I can garner some
> ammunition that supports FOR XML EXPLICIT.
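For readers unfamiliar with the technique under discussion, a minimal FOR XML EXPLICIT query looks something like this (the table and column names are invented):

```sql
-- Each row becomes a <customer id="..." name="..."/> element.
-- The Tag and Parent columns drive the nesting; with a single
-- SELECT there is only one level.
SELECT 1           AS Tag,
       NULL        AS Parent,
       CustomerId  AS [customer!1!id],
       CompanyName AS [customer!1!name]
FROM   dbo.Customers
FOR XML EXPLICIT
```

The universal-table format (Tag, Parent, and the bang-named columns) is what makes EXPLICIT both powerful and verbose; load-testing such a proc against an equivalent rowset proc, as suggested above, is the sound way to settle the performance question.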
Thursday, February 16, 2012
Behavior question about updateable resultsets...
I have a question regarding a certain behavior of updateable
resultsets. If I update a column using any of the updateXXX methods and
then try to use the getXXX methods from the same column to see if it
updated the results locally and not on the server, I get the same old
value. I have to call updateRow() but that updates the underlying
database and still gives me the old value until I execute the same
query again and get a new resultset. Maybe the code below will clarify
my question more..
Connection con = null;
Statement stmt;
ResultSet rst;
Class.forName("com.microsoft.jdbc.sqlserver.SQLServerDriver");
System.out.println("Getting connection.");
con = DriverManager.getConnection(url);
System.out.println("Connection successful.");
String st = "select age,sname,snum FROM student;";
stmt = con.createStatement(ResultSet.TYPE_SCROLL_SENSITIVE, ResultSet.CONCUR_UPDATABLE);
rst = stmt.executeQuery(st);
rst.last();
System.out.print(rst.getInt(1)+" ");
System.out.print(rst.getString(2)+" ");
System.out.print(rst.getLong(3)+"\n");
rst.updateInt(1,23);
rst.updateRow();
System.out.print(rst.getInt(1)+" ");
System.out.print(rst.getString(2)+" ");
System.out.print(rst.getLong(3)+"\n");
The output is:
Getting connection.
Connection successful.
25 Edward Baker 578875478
25 Edward Baker 578875478
If I were to run the same code again, I get:
Getting connection.
Connection successful.
23 Edward Baker 578875478
23 Edward Baker 578875478
Any/all help is appreciated
Thanks
Devansh Dhutia
University of Iowa
This is a bug and it does not have a trivial fix. I would like to encourage
you to file this using the product feedback website (below).
The problem here is that there are two mutually exclusive places where
column values transit through the driver. The first, used only by getters,
is through the columns array (lives on the statement). The second, used
only by setters, is through the colParam array (also lives on the
statement). The columns array is read-only; the colParam array is
write-only...
Note that the JDBC spec provides (in section 27.1.22, p. 718 - 719 JDBC API
Tutorial and Reference, Third edition, (Fisher, Ellis, Bruce)) that a result
set's own updates need not be visible to it. Obviously we would not like
this to be the default behavior but it is going to take a lot of work and it
would help to have customer feedback that clarified why this behavior should
be changed.
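The two-array split described above can be illustrated with a toy model. To be clear, this is NOT the actual driver source, just a hypothetical sketch of the design being described: getters read from a snapshot that setters never touch, so an updated value stays invisible until the row data is re-fetched.

```java
// Illustrative toy model (not the real Microsoft driver code) of the
// getter/setter split: getters read from a read-only column snapshot,
// setters write to a separate pending-update buffer.
import java.util.HashMap;
import java.util.Map;

class ToyUpdatableRow {
    private final Map<Integer, Object> columns = new HashMap<>();   // snapshot, read by getters only
    private final Map<Integer, Object> colParam = new HashMap<>();  // pending updates, written by setters only

    ToyUpdatableRow(Object... values) {
        for (int i = 0; i < values.length; i++) {
            columns.put(i + 1, values[i]); // 1-based column index, as in JDBC
        }
    }

    Object get(int col) {               // like getXXX: sees the snapshot only
        return columns.get(col);
    }

    void update(int col, Object value) { // like updateXXX: buffers the change
        colParam.put(col, value);
    }

    // updateRow() would send colParam to the server; the local snapshot is
    // untouched, which is why the old value keeps coming back. Re-running
    // the query (simulated here by refresh) finally exposes the new value.
    void refresh() {
        columns.putAll(colParam);
        colParam.clear();
    }
}
```

In this model, `update(1, 23)` followed by `get(1)` still returns the old value, exactly matching the output in the original post; only after `refresh()` (the equivalent of re-executing the query) does `get(1)` return 23.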
Entering a bug:
Go to http://lab.msdn.microsoft.com/produc...k/default.aspx
Product/Technology:
SQL Server 2005
Category:
JDBC Driver
Make sure to add JDBC SqlServer 2005 to the bug title.
Angel Saenz-Badillos [MS] DataWorks
This posting is provided "AS IS", with no warranties, and confers no
rights.Please do not send email directly to this alias.
This alias is for newsgroup purposes only.
I am now blogging: http://weblogs.asp.net/angelsb/