Showing posts with label ssis. Show all posts

Tuesday, March 27, 2012

Best SSIS scripting tome?

Is The Rational Guide to Scripting SQL Server 2005 Integration Services Beta Preview by Donald Farmer the best way to learn how to use scripting in SSIS as of late 2006? I'm not a .NET developer, although I come from a Java and C++ background.

I already own Professional SQL Server 2005 Integration Services, but that one doesn't cover scripting in much depth.

Thanks in advance.

Donald is a frequent poster on this site. However, this question might be hard for him (and me) to answer because of conflict of interest issues.

I work with Donald and I can assure you that he is quite knowledgeable in the uses, misuses and intricacies of SSIS.

As one of our primary points of contact with customers he also owns a wealth of experience in typical and interesting uses for scripting.

I hope this helps inform your decision.

|||

I've read that book. It gives you a nice understanding of the Script Task and Script Component ( http://msdn2.microsoft.com/fr-fr/library/ms345171.aspx ) and offers tips and advice throughout. It covers input/output variables and how to access them in the Script Task, and it has some practical real-world scenarios at the end.

However, the real power of SSIS is hidden in custom components, and that's not even mentioned - at least in my opinion. (It's also a lot easier to debug custom components than Script Components, which are nearly impossible to debug.)

Overall I was very happy with the chapters; the writing is clear and easy to understand. And if you understand Java, it should be easy to get started with.

Best SSIS equivalent for DTS2000 cursors

Hi everyone,

Right now we are trying to replace all stored procedures with SSIS logic.

One of the stored procedures does the following,

1. For every record in PSTREELEAF, find the hierarchy from
PSTREENODE and store it in local variables.

First the stored procedure pulls all the data from the PSTREELEAF table, based on a join condition, into local variables using a cursor. Then, for every record in the cursor, it finds the corresponding records in the PSTREENODE table. This is just a part of the stored procedure.

Can anyone tell me how I can do this in SSIS? Also, which task would be a good replacement for cursors in SSIS?

Thanks,

Praveen

Hi,
Do not use cursors if you can avoid it, and do not attempt to replicate cursors.
Use set-based logic, derived tables, subqueries, or UPDATE statements, or assign values to columns using subqueries as columns (TOP 1 or DISTINCT).
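As a hedged illustration of what "set-based logic with subqueries as columns" might look like (all table and column names here are hypothetical, not from the original procedure):

```sql
-- Instead of cursoring over each row and updating it individually,
-- assign the value in one set-based UPDATE with a correlated subquery.
UPDATE o
SET LatestStatus = (SELECT TOP 1 s.Status
                    FROM dbo.OrderStatus s
                    WHERE s.OrderID = o.OrderID
                    ORDER BY s.StatusDate DESC)
FROM dbo.Orders o;
```

One statement like this replaces the whole fetch/update loop, and the optimizer can process all rows in a single pass.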

Please post some more explicit examples and we may suggest another approach.

You can create loops in SSIS, but the point is to make sure you really need them in the first place.

Philippe

|||

Hi ,

Thanks, Philippe. Here is the code which I need to change to SSIS logic.

DECLARE NodeCursor CURSOR LOCAL FOR SELECT DISTINCT Tree_node_num,Range_from,Range_to,EFFDT
FROM FINANCEODS.DBO.PSTREELEAF
WHERE TREE_NAME='ACCTROLLUP' AND EffDt>=@effDt AND INACTIVE_DATE IS NULL

OPEN NodeCursor

/*********************************************************************************
Fetch nodes one by one and find out the hierarchy of the nodes
*********************************************************************************/

FETCH NEXT FROM NodeCursor
INTO @Tree_node_num,@LeafFrom,@LeafTo,@effDt
SET @Parent_node_num=@Tree_node_num

IF @@ERROR <> 0
BEGIN
ROLLBACK TRANSACTION
SET @p_errNum = 50505
SET @p_errMsg = 'SPUpdateAccounts: Error Fetching Node Cursor.'
RETURN -- Exit Stored Procedure
END
WHILE @@FETCH_STATUS = 0
BEGIN
/*****************************************************
While loop to find out all the hierarchy information
******************************************************/
WHILE (@Parent_node_num)<>0
BEGIN
/************************************************************************************
Find the parent node and its level (hierarchy)
*************************************************************************************/
SELECT @Parent_node_num=Parent_node_num, @Tree_node_num=Tree_node_num, @TreeLevel=TREE_LEVEL_NUM, @TreeNode=ISNULL(B.DESCR,A.TREE_NODE)
FROM FINANCEODS.DBO.PSTREENODE A
LEFT JOIN ( SELECT MAX(EFFDT) AS EFFDT,DESCR,TREE_NODE,INACTIVE_DATE FROM FINANCEODS.DBO.PS_TREE_NODE_TBL WHERE EFFDT<=@EFFDT GROUP BY DESCR,TREE_NODE,INACTIVE_DATE ) B ON A.TREE_NODE=B.TREE_NODE
WHERE A.Tree_node_num=@Tree_node_num
AND A.TREE_NAME='ACCTROLLUP' AND A.EFFDT=@EFFDT AND A.INACTIVE_DATE IS NULL AND B.INACTIVE_DATE IS NULL
/************************************************************************************
Store the hierarchy level information in local variables.
*************************************************************************************/
SELECT @Source= (CASE @TreeLevel WHEN 6 THEN @TreeNode ELSE @Source END),
@Type= (CASE @TreeLevel WHEN 5 THEN @TreeNode ELSE @Type END),
@Element= (CASE @TreeLevel WHEN 4 THEN @TreeNode ELSE @Element END),
@Component= (CASE @TreeLevel WHEN 3 THEN @TreeNode ELSE @Component END),
@FinStatement= (CASE @TreeLevel WHEN 2 THEN @TreeNode ELSE @FinStatement END)
SET @Tree_node_num=@Parent_node_num
END
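For what it's worth, on SQL Server 2005 this kind of parent-walking loop can usually be replaced with a single recursive common table expression (CTE), which an SSIS Data Flow source could then query directly. The sketch below is only an illustration: it climbs the ACCTROLLUP tree set-based, and it deliberately omits the effective-date and inactive-date filters from the original for brevity.

```sql
-- Sketch: flatten the ACCTROLLUP hierarchy in one set-based query
-- instead of walking parents one node at a time.
WITH NodeHierarchy (Leaf_node_num, Tree_node_num, Parent_node_num, TREE_LEVEL_NUM, TREE_NODE) AS
(
    -- Anchor: every node starts its own path.
    SELECT Tree_node_num, Tree_node_num, Parent_node_num, TREE_LEVEL_NUM, TREE_NODE
    FROM FINANCEODS.DBO.PSTREENODE
    WHERE TREE_NAME = 'ACCTROLLUP'
    UNION ALL
    -- Recursive step: climb to each node's parent.
    SELECT c.Leaf_node_num, p.Tree_node_num, p.Parent_node_num, p.TREE_LEVEL_NUM, p.TREE_NODE
    FROM FINANCEODS.DBO.PSTREENODE p
    JOIN NodeHierarchy c ON p.Tree_node_num = c.Parent_node_num
                        AND p.TREE_NAME = 'ACCTROLLUP'
)
SELECT Leaf_node_num, TREE_LEVEL_NUM, TREE_NODE
FROM NodeHierarchy;
```

The Source/Type/Element/Component/FinStatement assignments could then be done by pivoting TREE_LEVEL_NUM with CASE expressions over this result set, in one pass rather than one round trip per node.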

Wednesday, March 7, 2012

Best place for job posting?

We are looking to place a job posting for a developer experienced in SSIS
and SSRS. Are there any recommendations for the best place to post the job
offer?
thank you,
Bob M.

|||

http://jobs.sqlservercentral.com/

|||

Thank you for the reply, Dean. It looks like there is a grand total of four job postings on that site, which seems to indicate it may not be one of the 'best' places to post a job offer.
Am I reading the site correctly?
Bob M.

Best Oracle source provider?

Hello,
I am using a massive Oracle db in SSIS and am trying to figure out the best way to increase performance in moving data from this db to SQL Server. As far as Oracle sources go, which seems to have the best performance?
Thanks,
Adrian

|||

What did you mean by the 'Oracle sources'? We tried using OLEDB for Oracle and the performance is really poor.

|||

I've had that experience as well with OLEDB. What I meant by sources is either the .NET OracleClient, the Microsoft OLEDB provider for Oracle, or the Oracle provider for OLEDB. I believe these are the only three options to connect to Oracle. I can't even use the Oracle provider for OLEDB without numerous errors, and have found that the other two are quite slow as well. So I was just wondering what other people's experiences were, and whether they found one to be superior to another. And maybe some input from the SSIS developers as to what their suggestions are.
Thanks,
Adrian

|||

The OLEDB providers for Oracle that are currently available aren't terribly fast. Have you considered replicating (a subset of) your data to SQL Server from Oracle before (or at the beginning of) your package execution? This topic in BOL is probably where to start: ms-help://MS.SQLCC.v9/MS.SQLSVR.v9.en/rpldata9/html/2e013259-0022-4897-a08d-5f8deb880fa8.htm

|||

Hello,

We have built a super-fast connector for Oracle for SSIS. We have been able to achieve about 30 times the performance of the OLEDB Oracle provider. For example, we loaded 1 million rows in about 2 minutes 20 seconds. Let me know if you are interested in beta bits.

Regards
Sidharth
http://www.persistentsys.com
siddossy@.hotmail.com


Best method or is it even worth the effort?

Hi,

I have a question regarding a project I am working on and I'm not sure what the best way to do it may be, I am new to SSIS and would like to gain some exposure using it, but don't know if this project is overkill or even possible using SSIS.

*** PROBLEM ****

Basically, I have a large number of flat files, and each file contains different columns and data. What I want to do is import all of the data from the flat files into corresponding tables in SQL Server. The tables in SQL Server are already defined with a structure to match each file. The first 10 characters of each flat file's name are also the same as the name of its SQL table. So it seems easy enough to just import flat file to table one by one, but I cannot figure out how to match each flat file's structure to its table without setting up a task for each table (there are roughly 50-60 tables). Another reason I want to do it with SSIS is that these flat files have to be imported on a weekly basis.

**********************

Is it in my best interest to just create a stored procedure that basically loops through the files and imports the data? Or do you think SSIS is the best solution? I have looked at SSIS a bit and thought maybe I could use the Bulk Insert task, but it appears it cannot adapt dynamically to different column specs. I figured out how to loop through the flat files using the Foreach Loop task, but I could only get it to import into one SQL table. I also started looking at the Data Flow task and thought about creating a Script task, but I'd like to know whether that would work before I spend a lot of time trying to figure it out.

Any suggestions or thoughts are appreciated.

Thank you

Are you saying that there are many record layouts/formats in the same file? Or one file has only one format, but there are multiple formats and multiple files with those formats?

Either way, if you have 50 file formats, it seems like you're going to have to configure 50 flat file source connections. There may be a good way with the Script task, but that's not my strength. But I would choose SSIS over the other methods - at the least, it sounds like you should look into the Conditional Split transformation and variables. Let us know about the above question.

|||

To implement this without scripting (I'm under the impression that your 50 files / tables have different formats), you'd need to create 50 data flows.

It certainly could be implemented via a script, but I'm not sure what benefit you'd get from writing the script in SSIS versus just creating a VB or C# application.

|||

Not sure what it is you are doing, but it sounds EXACTLY like what I have recently done - except with 45 files/tables. I started to go down the route of individual DFs, but instead created individual packages for each of the files I have to process. Either way will work. Basically, we have a parent package that kicks off the children packages to process each file. This is done through iterating through our known files for each organization that will submit these 45 files.

This is working as designed, and we have not run into any trouble. We are still working on getting some better error logging in place now. Please feel free to post if you have any questions.

Regards

|||

I was hoping to create something a little less cumbersome, although I think that will work and may be my last resort. I have something right now that seems to work; I need to do further testing and development to see if it will meet my needs.

What I have done is create a simple foreach loop and inside the foreach loop I have a script task that takes the return value from the foreach loop which is the file name, and parses the string into values that I can pass into the bulk insert task.

In my script I set the Dts.Connections.Item("BulkFile").ConnectionString to the filename variable from the foreach loop and then I parse the string to get the name of my table and set a user defined variable I just called "TableName" with the parsed string value. I then open the Bulk Insert Task and set the DestinationTableName expression to my user defined variable "TableName".

My initial findings appear to work well, but so far I have only imported six of the tables dynamically; I still have 50+ to go. I have to create some stored procedures to clean out the tables before the bulk insert due to primary key constraints, etc., but basically once a week I am flushing all of the data in these tables and replacing it with the data from the flat files.

Friday, February 24, 2012

Best approach for pushing records to MS Access

All,

I am new to DTS/SSIS and have a couple of questions about using it to solve a problem. We have an application running on SQL Server 2005 where status records are written to a status table. I need to be able to send those records over to a status table in a legacy application running on Access.

Originally, I thought about writing a custom c# stored proc and accessing Access from it and then someone pointed me to DTS/SSIS.

Is there a way to execute the package based on a trigger event that a row was inserted or updated? If not, and I take a scheduled approach (every 3 minutes, etc.), do I have to maintain a column for the records that get processed so they are not picked up again?

In general, is SSIS the approach to take? The overall business requirements are straightforward, but I am not sure whether SSIS is overkill for this or not.

Thanks,

Steve

If I use an Execute SQL Task on the Control Flow, how do I use the resulting dataset as a data source on the Data Flow? I added a variable of type Object, but I cannot figure out how to reference it on the Data Flow designer tab.

Best approach

I have what I feel like is a simple package I am working to create. I am teaching myself SSIS as I go along.

The source server's SQL 2000 database allows NULL values in columns.

The destination server is also SQL 2000, but its database requires a value in each column.

So I do a basic source select of what I want. Next I need to read the values, replace any NULL with a space, do some column matching, and insert the rows into the destination server.

I believe I should use a Derived Column and an ISNULL expression to accomplish what I want.

Maybe there is a better way. Suggestion and comment appreciated.

Ryan

Try using the T-SQL ISNULL() function in your initial query, replacing any null values with an empty string or whatever...

ISNULL(myField,'')
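To expand on that with a hedged sketch (these column names are hypothetical), the replacement can be done for every NOT NULL destination column directly in the source query, which avoids the extra Derived Column step entirely:

```sql
-- Replace NULLs at the source so every NOT NULL destination
-- column receives at least an empty string.
SELECT ISNULL(CustomerName, '') AS CustomerName,
       ISNULL(Address, '')      AS Address,
       ISNULL(City, '')         AS City
FROM dbo.SourceTable;
```

The Derived Column approach works too; pushing it into the query just keeps the data flow simpler.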

Sunday, February 19, 2012

Benchmark of SSIS

Hi,

Has anyone seen any benchmark results for SSIS vs Informatica?

Thanks,

David

Mariner

David,

there is a document that compares SSIS and Informatica here: http://www.microsoft.com/sql/technologies/integration/conchango.mspx

and some comments here:

http://blogs.conchango.com/jamiethomson/archive/2005/03/22/1172.aspx

Rafael Salas

|||

Thanks, these are helpful, but they are not an official benchmark between the two products.

|||

David Botz wrote:

Thanks, these are helpful, but are not an offical benchmark between the two products.

I haven't seen anything published. I think the SSIS team did some benchmarking internally, though whether they would publish those results is a different matter. I don't know whether it was compared against other ETL tools either.

-Jamie

|||

My thoughts are that I wouldn't expect anything to be published, because I would guess a polished Informatica tool would kick the pants off a first-generation Integration Services build.

I would also expect the next major release of SSIS to be a real threat once they apply the learnings from our feedback and real-world usage.

|||

Phil Brammer wrote:

My thoughts are that I wouldn't expect anything to be published because I would guess a polished Informatica tool would kick the pants off of a first-generation Integration Services build.

I'm not so sure. Yes INFA have had many more years of doing this but that means that their core engine was built years ago (probably) and I would expect that since then they have tweaked it rather than re-engineered it.

SSIS has been built from the ground up on newer technology (hardware & software) and that cannot be dismissed.

So INFAs maturity/SSISs immaturity could be both an advantage and a disadvantage.

And besides, when it comes down to sheer speed, at some point you come up against physical limitations, so eking out extra nanoseconds of performance is neither here nor there regardless of which you are using.

Phil Brammer wrote:


I would also expect that the next major release of SSIS to be a real threat once they apply their learnings from our feedback and real-world usage.

Ooo that sounds like a challenge Phil

My advice is not to expect too much from the next release. The SQL Server team as a whole has realised the folly of a massive release that took 5 years to build and will be progressing on shorter release timeframes. Inevitably this means that they won't have time to get everything in there that they would like. On the plus side, we get to play with the new bits a lot sooner. I also know some of what is coming (though I'm sworn to secrecy). There's stuff that will delight, I'm sure, whilst at the same time there will be disappointments for stuff that doesn't make it in.

Having said that, I don't think I'll get into too much trouble by saying that you should expect the Script Task/Component to change a whole lot for the better in SSIS 2007

-Jamie

|||

Okay, well, the one thing SSIS has going for it is its bias towards using SQL Server databases, and rightfully so. They may have Informatica there; however, as of yet, SSIS is not a competitor when using other database platforms as destinations. So, room to grow if they want.

Jamie, I hear ya though! My comments were based on years of experience with first-gen products, MS software notwithstanding.

|||

Phil Brammer wrote:

Okay, well, the one thing SSIS has going for it is its bias towards using SQL Server databases, and rightfully so. They may have Informatica there, however as of yet, SSIS is not a competitor when using other database platform destinations. So, room to grow if they want.

Very good point. It's easy to forget non-SQL Server targets.

-Jamie

Thursday, February 16, 2012

behavior of ssis packages running under SSMS manually vs. job agent automatically

I have an SSIS package that has multiple large lookups without memory restriction. When I run the package manually from SSMS on the same server where it runs automatically under the job agent, the package errors out when the server's memory gets depleted by the loading of the large lookup reference data. One of the messages I get is

"An out-of-memory condition prevented the creation of the buffer object. "

Anyway, the package runs successfully when it runs automatically under the job agent.

I was curious as to why the above happens. Is that a bug, or is the run-time behavior different in these two environments by design?

js40

If you are running through SSMS, you are likely executing the package via your machine, not the server. Do you have the full SSIS client installed? (Do you have a SQL Server Integration Services service when looking at "Services" under the Control Panel?)

No, I ran the package on the same server where it runs automatically under SQL Agent. I started it manually from SSMS under Remote Desktop. The server is SQL Server 2005 (64-bit) with SP1 and 8 GB of memory.

Actually, several times it crashed SSMS, and the last thing I saw just before the SSMS window disappeared was some errors related to memory.

Below are the different errors I got when I ran this package several times.

1) The Data Flow task failed to create a required thread and cannot begin running. This usually occurs when there is an out-of-memory state.

2)The system reports 99 percent memory load. There are 8588873728 bytes of physical memory with 67133440 bytes free. There are 2147352576 bytes of virtual memory with 27369472 bytes free. The paging file has 12954959872 bytes with 4205600768 bytes free.

3)A buffer failed while allocating 2048 bytes.

4)An out-of-memory condition prevented the creation of the buffer object.

js40

Behavior of SSIS lookup transform on full cache setting with low computer memory

I would like to know what happens when a very large reference data set for a lookup transform with full caching enabled is getting loaded during package execution and the computer memory runs out or is very low.

Does SSIS

a) give an out of memory error of some sort

b) resort to a no caching or partial caching mode

c) maintain the full caching mode but will switch to using the paging file(virtual memory).

I think it will resort to using the page file, in which case the benefits of in-memory lookups are lost and performance would suffer. If I cannot upgrade the memory or shrink the reference set somehow, I should switch that lookup task to use partial caching, or no caching with an indexed lookup table. Would this make sense?

It won't do B unless you explicitly set it. I think it does C as part of the standard Windows memory management, but you might see A too if you are dealing with large data sets.

Your approach would make sense. I'd start by making sure that you are only caching the bare minimum reference set needed, though.

Beginning SSIS - Sharepoint and Excel

Hi all, I have never used SSIS before, but am looking to use it in this aspect:

1) A user uploads an Excel file into Sharepoint. This will be through a Document Library in a Sharepoint page.

2) When this file is uploaded, I'd like SSIS to notice there is a new file and process it - it will pull information from the Excel file, put it into a database, and - if this is possible - delete the new file.

3) This is iffy. Can SSIS then generate an InfoPath document from the information stored in the database? If not, I can just have a InfoPath query form.

I'd just like to know if this is possible. If you have any helpful links, please let me know - I would greatly appreciate it!

Thanks,

James

James,

2) SSIS can do all of this. You can use the FileWatcher task as provided on SQLIS.com. Deleting the file is easy.

3) No idea. If there is an API for InfoPath then it should be possible.

-Jamie

|||Thanks for the reply, Jamie. I'll take a look at that.|||

Alright, well I've taken a look and I'm having some issues -

In Control Flow, I have the File Watcher watching a directory for any new Excel files, and the output of the File Watcher is User::NewExcelFile, which I understand will be a string such as "file.xls".

I also have an Excel connection manager which points to the format of the Excel file that will be used. Then I have a SQL Server destination that is mapped to the database the information will be put into. How does the File Watcher control flow task pass the filename to the Excel/SQL data flow tasks? They don't allow variable inputs.

Thanks,

James

|||

The Excel source component uses an Excel Connection Manager which has a property called ExcelFilePath that contains the name and location of the file.

You can set this property using a property expression. (Select the connection manager, press F4, expand 'Expressions' in the properties pane). Set it to the value returned by the File Watcher.

-Jamie
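For reference, assuming the File Watcher Task writes the full path into the User::NewExcelFile variable as described above, the property expression set on the connection manager's ExcelFilePath property would simply be the variable reference:

```
@[User::NewExcelFile]
```

At runtime the connection manager then picks up whatever path the File Watcher last reported.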

|||

Thanks Jamie, that works - I assume as File watcher notices a new file, it triggers the following control/data flow.

Is there a way to set the Excel file's name to be something dynamic? For example, to watch for an Excel file of any name, not just the name of the Excel file specified in the Excel connection manager? File watcher appears to only watch a path, which it can pass to the connection manager when triggered by a new file upload, but the Excel Connection Manager needs a static Excel name.

And what is the task for deleting the Excel upon completion of the service?

Thanks for your time! I am new to SSIS and coming to grips with it :)

-James

EDIT:

Sorry, let me rephrase:

I set the File Watcher Output Variable Name to "User::NewExcelFile" - I assume that this contains either the path to the containing folder OR the containing folder + the name of the file.

If it is the path only, then how can I set the Excel Connection manager to get the name of the file as well?

And if it is the path and name, how do I parse @User::NewExcelFile to put the file path in ExcelFilePath and the name in Name?

Also, when ExcelFilePath is set to @User::NewExcelFile, I cannot set and save the file path in the Excel Connection Manager. It clears the field - this does not let me map the Excel fields to the SQL destination.

|||

James,

The output variable will contain the path and the name of the file.

You need to put the whole thing (the path and the name) into the ExcelFilePath property, so there's no need to parse it. The Name property is just the name of the connection manager; it's irrelevant.

Regarding the last point, the reason the field is cleared is that there is nothing in @User::NewExcelFile yet. Put a path in that variable to an Excel file that has the same metadata as the file you will be processing at runtime. This value is only used at design time, as it will be overwritten at runtime by the FileWatcher Task.

-Jamie

|||

I used the file system watcher to read an Excel file on my PC and it worked fine, but when I tried to read the Excel file from SharePoint it didn't work. The FileWatcher box stayed yellow for so long that I had to stop the SSIS package.

So my question is: what is the cause of this? Do I need to set something, or am I missing something? Please help.


Monday, February 13, 2012

beginner question

Hi, folks:
I am looking into using SSIS to create an OLAP database. How easy is it to write a package to re-pull some fact data from the source database based on some flags? What happens is we are planning to roll up some minute-by-minute data into hourly averages using SSIS, but users are allowed to modify the minute-by-minute data (the maximum is 90 days). Say we set up some flags, and the package comes in every hour to check for these flags and re-ETLs only the changed data. Can it be done, or is there a better way to do it?

|||

SSIS can push data straight into an OLAP cube from the pipeline. I'm not sure about updating values that are already there though (if that is what you are asking).

-Jamie

|||

Yes, updating the OLAP values is what I am asking. We do gas-flow calculation. For each well, we insert minute-by-minute data. Out of a thousand wells, only 10 wells will have the minute-by-minute data changed, and all I plan to do is flag these 10 wells and re-ETL them.
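A hedged sketch of what the hourly rollup source query for the flagged wells might look like (every table and column name here is hypothetical, invented for illustration):

```sql
-- Re-aggregate only the wells whose minute data changed, truncating
-- the reading time down to the hour before averaging.
SELECT WellID,
       DATEADD(hour, DATEDIFF(hour, 0, ReadingTime), 0) AS ReadingHour,
       AVG(FlowRate) AS AvgFlowRate
FROM dbo.MinuteReadings
WHERE WellID IN (SELECT WellID FROM dbo.WellFlags WHERE Reprocess = 1)
GROUP BY WellID, DATEADD(hour, DATEDIFF(hour, 0, ReadingTime), 0);
```

The package could then delete and re-insert the affected hourly rows (or reprocess the relevant cube partition) and clear the Reprocess flags in the same run, so unchanged wells are never touched.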