Friday, February 10, 2012

Before Insert

Hi,

I'm migrating an InterBase database across to SQL Server 2005. The problem I have is that it uses BEFORE triggers, which allow data to be modified before the table is populated: if a value too large for the column's data type is inserted, the trigger scales the value down.

The idea is to avoid changing the front end if at all possible. Is there a way I can mimic this behaviour?

I've tried AFTER triggers, but the insert fails before it ever reaches the trigger. The same happens with INSTEAD OF triggers, as I believe the inserted pseudo-table has the same definition as the base table, so the insert fails there too.

I've looked at rules, but they only let me restrict the values going in, not change them.
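For reference, a SQL Server rule (the legacy CREATE RULE feature) can only reject a value, never rewrite it, which is why it can't stand in for a BEFORE trigger. A minimal sketch, with a hypothetical rule name and assuming a smallint target column:

```sql
-- Hypothetical sketch: a legacy rule bound to a column can only
-- validate the incoming value; it cannot modify it before the insert.
CREATE RULE dbo.SmallintRange
AS @value BETWEEN -32768 AND 32767
GO

-- Bind the rule to a column; an out-of-range value is simply rejected,
-- there is no opportunity to scale it down first.
EXEC sp_bindrule 'dbo.SmallintRange', 'BaseTable.Quantity'
GO
```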

Any ideas would be appreciated.

Many thanks.



|||

Refer to Books Online, Topic: 'INSTEAD OF Triggers'.

|||

SunnyD,

I'm sure there's more to the picture than meets the eye, but I have a couple of questions.

1. If a value is larger than the datatype of the field, how are you scaling it down?

2. When you scale it down, are you losing value or simply trimming the fat?

The reason I ask is that the frequency of scaling down may ultimately justify increasing the data type size for the field.

Triggers are often a reasonable solution but we have to be careful that we don't abuse the intent.

Also, the filtering can be accomplished at the source instead of waiting to trim at the database level. I realize you don't want to change the front end, but this is where validation should occur. This helps eliminate the GIGO (garbage in, garbage out) potential.

In closing, although databases do have the functionality to clean house, it doesn't mean that we should neglect the programming practice of using validation on the front end. Remember, databases were designed as a storage facility, not a programming environment.

Just my twist on it,

Adamus

|||

I totally agree with everything said, and given the choice I would rather change the front end. However, that's not an option at the moment, and my scope is to get the back end working identically to the way it works now. I've tried INSTEAD OF triggers, but again this does not appear to work. See the example below:

CREATE TABLE BaseTable
(OrderKey int PRIMARY KEY IDENTITY(1,1),
Quantity smallint NOT NULL)
GO

--Create a view that contains all columns from the base table.
CREATE VIEW InsteadView
AS
SELECT OrderKey, Quantity
FROM BaseTable
GO

--Create an INSTEAD OF INSERT trigger on the view.
CREATE TRIGGER InsteadTrigger on InsteadView
INSTEAD OF INSERT
AS
BEGIN
--Build an INSERT statement
INSERT INTO BaseTable (Quantity)
SELECT CASE WHEN Quantity > 99999 THEN Quantity/10.0 ELSE Quantity END
FROM inserted
END
GO

INSERT INTO InsteadView (Quantity) SELECT 9999 --Works ok
INSERT INTO InsteadView (Quantity) SELECT 99999

Msg 220, Level 16, State 1, Line 1
Arithmetic overflow error for data type smallint, value = 99999.
The statement has been terminated.

Any other thought guys?

|||

smallint is defined as a value between -32768 and 32767. So, 99999 definitely causes overflow.

Either change your datatype or change your case/when to trap the correct range.

e.g.

SELECT CASE WHEN Quantity > 32767 THEN Quantity/10.0 ELSE Quantity END

|||

If the data type is not changed, changing the CASE will not work! I'm still getting the overflow, as the inserted pseudo-table is based on the base table and therefore cannot hold a value greater than 32,767; i.e. the Quantity column in inserted will not store a value greater than 32767.

|||

The VIEW has the SAME datatypes as the underlying table.

You are attempting to INSERT a value larger than 32767 into the VIEW and it will FAIL since the datatype is smallint.

The only way that you will be able to accomplish this task is to create another table with a larger datatype, and use a trigger on that table to move the data to your original table.

In my opinion, a very bad kludge... (Change the original table's datatype and stop perverting the data.)
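To illustrate the staging-table idea suggested above (a sketch only, with hypothetical table and trigger names, assuming the BaseTable/Quantity schema from earlier in the thread — the front end would insert into the staging table instead):

```sql
-- Hypothetical staging table, wide enough to accept the raw value
CREATE TABLE OrderStaging (
    OrderKey int IDENTITY(1,1) PRIMARY KEY,
    Quantity int NOT NULL   -- wider than the smallint target
)
GO

-- AFTER trigger scales the value down and moves it to the real table
CREATE TRIGGER trA_OrderStaging ON OrderStaging
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON
    INSERT INTO BaseTable (Quantity)
    SELECT CASE WHEN Quantity > 32767
                THEN CAST(Quantity / 10.0 AS smallint)
                ELSE Quantity END
    FROM inserted
END
GO
```

As the poster says, this is a kludge: it adds a second copy of the data in flight and still loses information when scaling.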

|||I believe you can create a view with a different data type (length):

Create table BaseTable (Srno int, Descr varchar(30))
go

Create view MyView as
select
Srno, Descr = convert(varchar(50), Descr )
from
BaseTable
go

Now you can create an INSTEAD OF trigger on this view...

|||

Bushan,

I'm not too sure that will work for an INSERT. Have you tried it and been successful?

|||

Code Snippet

-- Rename your base table
Create table RenamedTable (
Srno int,
descr varchar(30)
)
go

-- Create a view with same name as your base table
create view BaseTable
as
select
Srno, Descr = convert(varchar(50), Descr )
from
RenamedTable -- This is your Original/Renamed table
go

--Create an INSTEAD OF INSERT trigger on the view.
CREATE TRIGGER trI_BaseTable on BaseTable
INSTEAD OF INSERT
AS
BEGIN
set nocount on
INSERT INTO RenamedTable
SELECT Srno, left(Descr,30) FROM inserted
END
GO

insert into BaseTable values (1, 'One')
insert into BaseTable values (2, 'Two123123123123123123123123123123123123123123123')
select * from RenamedTable

|||

I should have been a bit more explicit. The OP's table DDL indicates the presence of an IDENTITY field.

Because of the IDENTITY field, the code suggestion you provided doesn't seem to work as presented to solve the OP's issue.

I'm trying to understand if you have created a 'work-around' for handling the absence of the IDENTITY value in inserted. Even setting IDENTITY_INSERT ON in the Trigger doesn't seem to allow a way to get around the absence of the IDENTITY value in inserted. But I'm hoping you have found a way...

...Inquiring minds want to know...

|||

Here you go...

Code Snippet

-- Rename your base table
Create table NewDepartment (
DeptID int identity(1,1),
Dname varchar(6),
Location varchar(20)
)

-- Create a view with same name as your base table
create view Department
as
select
DeptID,
Dname = convert(varchar(50), Dname ),
Location
from
NewDepartment -- This is your Original/Renamed table
GO

--Create an INSTEAD OF INSERT trigger on the view.
CREATE TRIGGER trI_Department on Department
INSTEAD OF INSERT
AS
BEGIN
set nocount on
INSERT INTO NewDepartment (Dname, Location)
SELECT
left(Dname,6), Location
FROM
inserted
END
GO

insert into Department values (1,'Sales', 'California')
insert into Department values (2,'Marketing', 'NewYork') -- 'ing' will be truncated from Marketing

|||

Here is a kludge that changes the underlying datatype of the view to allow the larger value.

--Create a view that contains all columns from the base table.
CREATE VIEW InsteadView
AS
SELECT OrderKey, cast(Quantity as bigint) [Quantity]
FROM BaseTable
GO

--Create an INSTEAD OF INSERT trigger on the view.
CREATE TRIGGER InsteadTrigger on InsteadView
INSTEAD OF INSERT
AS
BEGIN
--Build an INSERT statement
INSERT INTO BaseTable (Quantity)
SELECT CASE WHEN Quantity > 32767 THEN Quantity/10.0 ELSE Quantity END
FROM inserted
END
GO

|||

Thanks for taking the time and effort.

It is so much more helpful when we provide the OP a response with a suggested solution that actually solves his/her problem.
