
Tuesday, March 20, 2012

Best Practices for Insert/Update/Delete

For now, doing a small school project, I find writing stored procedures for INSERT useful, e.g. checking for existing data and not inserting duplicates. That might not be the best method; I got advice here that I can use unique constraints instead. What about UPDATE and DELETE? Stored procedures for those too? Do the pros write SPs for everything? Currently I use dynamically generated SQL from SqlDataSources for update/delete, though some deletes are SPs as well.

My 2 cents... SPs are a great way to interface with a database. For me, I will never access a DB any other way. In fact, I would say it is good practice to secure the database so that only the defined stored procedures can be executed against it: no direct table reads or writes. This ensures that no one (other than an errant DBA) can do anything other than what is intended by the interface provided through the stored procedures. It is essentially just another layer in the application model. It also provides a level of reusability and design hiding, generally considered good things. An application then only needs to make "function calls" on the database, rather than some nasty select statement that, oh, by the way, I split this table out into two separate ones, so now you have to go rewrite all your queries in every application that was ever written against my database because I didn't just write stored procedures in the first place, which, looking back, would have been smart because then I would only have had to fix things in two places.

// of course, you probably could create a view with the same name as the old table to fix it, but, just making a point

|||

I think you should use stored procedures whenever possible. But for making sure that the value in a column is unique, you should always use unique constraints and the like rather than checks in code. For insert/update/delete, always use SPs.
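As a rough sketch of that advice (the table and procedure names here are invented for illustration), the uniqueness rule lives in a constraint rather than in procedural checks, and the insert SP stays simple:

```sql
-- Hypothetical table: uniqueness is enforced declaratively by a constraint,
-- not by a SELECT-then-INSERT check in code.
CREATE TABLE dbo.Customer (
    CustomerId INT IDENTITY PRIMARY KEY,
    Email      VARCHAR(256) NOT NULL CONSTRAINT UQ_Customer_Email UNIQUE
);
GO

-- Insert SP: no manual existence check needed; a duplicate simply raises
-- a constraint violation the caller can handle.
CREATE PROCEDURE dbo.Customer_Insert
    @Email VARCHAR(256)
AS
BEGIN
    SET NOCOUNT ON;
    INSERT dbo.Customer (Email) VALUES (@Email);
    SELECT SCOPE_IDENTITY() AS CustomerId;  -- return the new key to the caller
END
```

The constraint also protects the table against any path that bypasses the SP, which a code-level check cannot do.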


Sunday, March 11, 2012

Best practice for conditional insert else select?

I have several places where I need to get the id (primary key) of a resource, inserting a row if the resource does not exist (i.e. an artificial key to be used as an FK for another table). I should probably change this varchar key lookup to use a hash index, but that is beside the point.

So the table is essentially like:

CREATE TABLE MyLookup(id int identity primary key nonclustered, mykey varchar(256));

CREATE CLUSTERED INDEX mylookup_cidx_mykey ON MyLookup(mykey);

I see two main approaches for how I can do my get-id-with-insert-if-needed.

(Approach 1)

DECLARE @id INT;

SELECT @id = id FROM MyLookup WHERE mykey = 'some key value';

IF (@id IS NULL)

BEGIN

INSERT MyLookup (mykey) VALUES ('some key value');

SET @id = SCOPE_IDENTITY();

END

(Approach 2)

DECLARE @id INT;

INSERT MyLookup (mykey) SELECT 'some key value' WHERE NOT EXISTS (SELECT id FROM MyLookup WHERE mykey = 'some key value');

IF (@@ROWCOUNT = 0)

SELECT @id = id FROM MyLookup WHERE mykey = 'some key value';

ELSE

SET @id = SCOPE_IDENTITY();

From some quick tests in Profiler, approach 2 seems to be a bit faster and to have lower resource utilization. But I'm not sure whether it takes more aggressive locks even in the case where the mykey value 'some key value' already exists and no insert is needed. Approach 2 also looks cleaner to me, but I don't mind a bit of extra code if it gives me better scalability through less lock contention.

Any tips on what is considered best practice for a conditional insert like this, or on how to get detailed lock info for a query? The lock info in Profiler was all Greek to me; it just showed a hex value with each lock acquired/released, so I have no idea what it was telling me. Is my only option to run exhaustive tests and look at the perf numbers from the black box?

I went ahead and did some testing; no big surprises. The tests were single-client, though, so they don't give me any info about locking.

I found, unsurprisingly, that the sparser the table, the narrower the gap between the two options; as the likelihood of needing an insert went down, the first approach became more effective. In my production environment I would hazard a guess that I need an insert around 5% of the time, so I'll probably go with approach 1 in general. Although I did change my approach 1 to incorporate approach 2 within it: first I do a select, and then if the id is null I do the conditional insert. There is a slight increase in maintenance cost since I have to duplicate the code for the existence check, but I think it is worthwhile in my cases that are fairly high-traffic.

Any other points of view on this?
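One point worth adding: both approaches can race under concurrency, since two sessions can pass the existence check and then both attempt the insert. A common pattern (a sketch, not the poster's own code) is to take an update-range lock up front so the check and the insert are atomic:

```sql
-- Race-safe get-or-insert: UPDLOCK + HOLDLOCK holds a range lock on the key,
-- so no second session can insert the same key between our check and insert.
DECLARE @id INT;

BEGIN TRAN;

SELECT @id = id
FROM MyLookup WITH (UPDLOCK, HOLDLOCK)
WHERE mykey = 'some key value';

IF (@id IS NULL)
BEGIN
    INSERT MyLookup (mykey) VALUES ('some key value');
    SET @id = SCOPE_IDENTITY();
END

COMMIT TRAN;
```

The hints cost some concurrency on hot keys, so this is a trade-off against the simpler alternative of adding a unique constraint on mykey, attempting the insert, and re-selecting on a duplicate-key error.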

Saturday, February 25, 2012

Best GUID Storage

Because of the problem getting IDENTITY primary key values back when
inserting batches of rows, I would like to experiment with using
GUIDs. Within an application, I would like to assign the primary keys
to the rows and then pass them into the INSERT statements. Then I
wouldn't have to worry about using triggers or Identity scope to
determine the new primary keys.
My question is basically, what's the best datatype to store the GUIDs
in the column? From what I've read so far, it looks like
UniqueIdentifier or CHAR(40) are my options. Is there any drawback to
using UniqueIdentifier?|||If you are storing a GUID, then why not use the uniqueidentifier data type?
In SQL Server Books Online, read the page titled "Using uniqueidentifier
Data". This page discusses the advantages and disadvantages of this
datatype.
--
HTH,
Vyas, MVP (SQL Server)
http://vyaskn.tripod.com/
Is .NET important for a database professional?
http://vyaskn.tripod.com/poll.htm
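A minimal sketch of the client-assigned-GUID approach the original poster describes (the table, column names, and the literal GUID are invented for illustration): the application generates the key and passes it in, so no identity round-trip is needed:

```sql
-- uniqueidentifier primary key; the client supplies the key value.
CREATE TABLE dbo.Orders (
    OrderId  UNIQUEIDENTIFIER NOT NULL PRIMARY KEY,
    Placed   DATETIME NOT NULL DEFAULT GETDATE()
);
GO

-- The GUID is generated in the application (e.g. Guid.NewGuid() in .NET)
-- and passed into the INSERT, so the caller already knows the new key.
INSERT dbo.Orders (OrderId)
VALUES ('6F9619FF-8B86-D011-B42D-00C04FC964FF');
```

One caveat: random GUIDs used as a clustered key insert at random positions in the index, causing page splits and fragmentation, which is part of what the Books Online page Vyas mentions weighs up.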
|||TW,
I've used both, and it nearly always comes down to interoperability.
Some systems cannot deal with binary, so you have to go varchar(36)/char(36).
Where did you get 40, incidentally?
James Hokes
|||Excellent choice to use Guids, imho.
Vyas has already pointed you to a good article on the topic, but here are a
couple of extra things not mentioned in that article:
(a) A benefit of using Guids instead of Identities is that if you ever need
to implement horizontal partitioning on the table, it will be substantially
easier with Guids. With Guids, the partitioning process is virtually
seamless to the application, but partitioning tables with Identities nearly
always breaks the application.
(b) On the other hand, a problem with using Guids that is not mentioned in
that article is that T-SQL has no ISGUID()-type function, which causes minor
coding issues. Of course, it's possible to roll your own.
Regards,
Greg Linwood
SQL Server MVP
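The "roll your own" check Greg mentions could look something like the following sketch (the function name is invented, and the pattern only validates the canonical 8-4-4-4-12 hex form):

```sql
-- Hypothetical helper: returns 1 if @s looks like a canonical GUID string.
-- The REPLACE builds a LIKE pattern of 32 hex-digit character classes
-- separated by hyphens in the standard positions.
CREATE FUNCTION dbo.IsGuid (@s VARCHAR(40))
RETURNS BIT
AS
BEGIN
    RETURN CASE
        WHEN @s LIKE REPLACE('xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx',
                             'x', '[0-9A-Fa-f]')
        THEN 1 ELSE 0
    END;
END
```

It accepts only the unbraced form; inputs with surrounding braces or a NULL input fall through to 0, so adjust to taste.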

Sunday, February 19, 2012

Benchmark. Inserting records.

How many inserts can SQL execute in a second?
The table that's inserting into has 3 fields (numeric, char(15) and
datetime)
and no other kind of SQL statements are running against it.
Hardware: quad proc, 2GB RAM, RAID.
TIA,
Nic|||This is one of those 'it depends' issues...
depending on the speed of your disks, number of indexes, other users on the
system, blocking, etc...
you could easily do hundreds or a few thousands per second on high end
hardware.
On 2 procs... I'd be thinking more in the range of hundreds... but only
testing will know for sure.
--
Brian Moran
Principal Mentor
Solid Quality Learning
SQL Server MVP
http://www.solidqualitylearning.com
|||First, parallelism is the key to optimizing data-loading performance.
Create your insert routine so that it can easily be partitioned and
parallelized.
Depending on your application, you might also consider using bulk insert,
bcp, or DTS; it should be possible to achieve tens of thousands of rows per
second with a table that narrow. I would estimate that with a midrange server
you could easily hit 50K/sec with bulk insert into an empty heap with only a
couple of streams.
----
The views expressed here are my own
and not of my employer.
----
|||On a 2x2.4 box, using individual stored-proc calls per single-row insert, I can get >7K/sec using 10 separate threads.
By consolidating more than one single-row insert statement into each stored procedure, >18K single-row inserts/sec is possible, and >30K rows/sec on multi-row inserts.
If you are doing more than one single-row insert in a single stored proc, try using BEGIN/COMMIT TRAN even if it is not required; this consolidates the transaction log writes.
Go to the next SQL Server Magazine Connections conference for more info; Brian is there as well.
www.sqlconnections.com
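The log-consolidation tip can be sketched like this (the procedure, table, and column names are invented, modeled on the three-column table in the question): without the explicit transaction, each INSERT forces its own log flush; inside one transaction, the log writes are batched at COMMIT:

```sql
-- Hypothetical batch-insert proc: one transaction around several single-row
-- inserts means one log flush at COMMIT instead of one flush per statement.
CREATE PROCEDURE dbo.InsertReadings
    @v1 NUMERIC(18,0), @v2 NUMERIC(18,0), @v3 NUMERIC(18,0)
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRAN;
    INSERT dbo.Readings (val, tag, stamp) VALUES (@v1, 'a', GETDATE());
    INSERT dbo.Readings (val, tag, stamp) VALUES (@v2, 'b', GETDATE());
    INSERT dbo.Readings (val, tag, stamp) VALUES (@v3, 'c', GETDATE());
    COMMIT TRAN;
END
```

The wider the batch per transaction, the fewer log flushes per row, up to the point where lock duration starts to hurt concurrent readers.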

Monday, February 13, 2012

Beginner question about inserting records

Hi,
I am new to SQL Server (version 8) and hope that someone can help me
solve a problem.
A colleague has set up a SQL Server database with several tables for
which he has constructed multiple applications. As a result, their
structure cannot be altered in any way.
I have received updates for the data -- but the updates are in dbase
files. I would like to insert these dbase files into the existing table
structures (a simple thing to do in Foxpro), but I can't seem to figure
out how to do it in SQL Server.
Can someone point me in the right direction? (I've searched the
archives and gotten a few hints about bulk insert but I'm missing some
critical piece of the puzzle.)
Many thanks,
Jo
|||
Jo, take a look at BULK INSERT and bcp in Books Online. If these
will not suffice and you must perform transformations on the data as it
comes in (data validation, re-sequencing of keys, normalizing values
such as yes = 1, no = 0, etc.), then take a look at Data Transformation
Services (DTS). DTS has a wizard for doing imports like the one you are
describing.
HTH
Rick Sawtell
MCT, MCSD, MCDBA
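As a rough sketch of the BULK INSERT route (the file path and format here are invented; a dBASE file would first need exporting to a delimited text file, since BULK INSERT reads flat files rather than .dbf files):

```sql
-- Load a comma-delimited export into the existing table.
-- 'C:\updates\customers.txt' is a made-up path for illustration.
BULK INSERT dbo.Customers
FROM 'C:\updates\customers.txt'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2   -- skip a header row, if the export has one
);
```

The column order in the file must match the table (or a format file must map it), which is one reason the DTS wizard below is often the easier starting point.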
|||BULK INSERT is a T-SQL command, but it doesn't understand complex file formats. What you should be
looking at is DTS, which is a COM object for import/export. There are three tools in SQL Server
that use this COM object:
DTS Wizard. This is the easiest one to use. Start here.
Package Designer. Use this after you've done a few wizards.
DTSRUN.EXE. Schedules a package created with either of the above two.
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
|||Hi, and thank you. The DTS wizard did the trick for this particular
problem (although I see that bulk insert may be helpful in the future).
May I ask a couple of follow-up questions regarding DTS?
1. Is it possible, using the DTS wizard, to change a field name on the
fly?
For example, some of the data tables that I am updating include
underscores in the field names. I need to remove the underscores to
make the updates compatible with previous issues of the data. (I could
do it manually after the import is completed, but since I expect to
repeat this exercise frequently, it would be helpful if I could make
this change during the import.)
2. Once a DTS package has been saved, can it be edited?
For example, I realized after I had saved a DTS and run it that one of
my field types was incorrect, but I couldn't see how to edit the DTS.
(I ended up recreating the whole thing to fix the error.)
Many thanks,
Jo
