
Tuesday, March 27, 2012

Archiving

What's the best way to archive tables with binary data? Flat files/bcp/DTS
may not work if you need to import binary data (not a fixed-length column)
containing quotes, etc. What I am thinking is to export to another database
and back it up/archive on a regular basis. That way it will be easy to
restore when needed. Any other options?
Thanks
BVR
Why not just back up the database itself?
One thing you could do is put the tables with binary data on their own
filegroup, then back up just that filegroup more frequently, if that's
what you need.
Adam Machanic
SQL Server MVP
http://www.sqljunkies.com/weblog/amachanic
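Adam's filegroup idea can be sketched in T-SQL. This is a minimal illustration only; the database name, filegroup name, file paths, and table are all hypothetical, not from the thread:

```sql
-- Sketch only: MyDb, BLOB_FG, dbo.Documents, and the paths are hypothetical.
-- 1. Add a dedicated filegroup and file for the binary-data tables.
ALTER DATABASE MyDb ADD FILEGROUP BLOB_FG;
ALTER DATABASE MyDb ADD FILE
    (NAME = MyDb_blob, FILENAME = 'D:\Data\MyDb_blob.ndf')
    TO FILEGROUP BLOB_FG;

-- 2. Create the binary-data table on that filegroup.
CREATE TABLE dbo.Documents
(
    DocId   int IDENTITY PRIMARY KEY,
    DocBody image NOT NULL   -- image = the SQL 2000-era binary type
) ON BLOB_FG;

-- 3. Back up just that filegroup on its own, more frequent schedule.
BACKUP DATABASE MyDb
    FILEGROUP = 'BLOB_FG'
    TO DISK = 'E:\Backup\MyDb_blob_fg.bak';
```

The rest of the database can stay on its normal, less frequent backup schedule.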

Archiving

What's the best way to archive tables with binary data? Flat files/bcp/DTS
may not work if you need to import binary data (not a fixed-length column)
containing quotes, etc. What I am thinking is to export to another database
and back it up/archive on a regular basis. That way it will be easy to
restore when needed. Any other options?
Thanks
BVR

Vyas has a great article on his web site about archiving.
http://vyaskn.tripod.com/
Here's the link: http://vyaskn.tripod.com/sql_archive_data.htm
--
HTH,
Vyas, MVP (SQL Server)
SQL Server Articles and Code Samples @ http://vyaskn.tripod.com/
Narayanan,
I saw your link. It looks good, but I think it may or may not work if the
orders for 6 months come to more than 100,000 rows, which is what we have in
a real-time system.
My solution (let's take a 6-month period of 100,000 records, where 10,000
must stay on the live production database and the rest move to a history
database) was this, and it is working great:
1. Check the record count of the table based on date.
2. Use a chunking method: copy 1,000-5,000 rows at a time, depending on the
load, from the live database to a temp archive database on the live server,
then call another stored procedure to copy the data from the temp archive to
the remote archive database server.
(The reason: if the network between the live and remote servers goes down
for whatever reason, the data is still in the temp archive, not lost, and
you can come back and redo the copy.)
3. Once the data is copied to the remote archive server, come back and
delete it from the live database with "delete from live where seqno in temp
archive database", and later delete it from the temp archive. (seqno is a
primary key column.)
I go beyond this: once all the data has been copied, I check that the first
and last rows copied to the archive server exist, meaning the copy
succeeded. I wrote a program to do this; the program just calls 3 stored
procedures, and the stored procedures do everything above.
Too much, isn't it?
Arun
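The chunking step above can be sketched in T-SQL using SQL 2000-era SET ROWCOUNT (TOP in DML came later). All database, table, and column names here (LiveDb, TempArchiveDb, Orders, seqno, OrderDate) are assumptions for illustration, not from the post:

```sql
-- Sketch only: names are hypothetical; run repeatedly until no rows qualify.
-- Copy one chunk (up to 5000 rows) of old data into the temp archive.
SET ROWCOUNT 5000;

INSERT INTO TempArchiveDb.dbo.Orders
SELECT *
FROM LiveDb.dbo.Orders
WHERE OrderDate < DATEADD(month, -6, GETDATE())
  AND seqno NOT IN (SELECT seqno FROM TempArchiveDb.dbo.Orders);

SET ROWCOUNT 0;

-- After the chunk has been pushed on to the remote archive server
-- (step 3 in the post above), remove the copied rows from the live table.
DELETE FROM LiveDb.dbo.Orders
WHERE seqno IN (SELECT seqno FROM TempArchiveDb.dbo.Orders);
```

Deleting by seqno against the temp archive, rather than re-evaluating the date filter, is what guarantees only successfully copied rows are removed.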


Tuesday, March 20, 2012

Approach help to load data from flat files into a relational table where data is coming as spaces

Hi,

My input is a flat file source, and it has spaces in a few columns of the data. These columns are linked to another table as a foreign key, and when I try loading them into a relational structure, a foreign key violation occurs. Is there a standard method to replace these spaces?

What approach should I take so that the data gets loaded into a relational structure?

For example:

Name | Age | Salary | Address
dsds | 23 | (spaces) | fghghgh

Salary | Description | Level
2345 | nnncncn | 4

Here Salary is used in this example; the data type is char in the real scenario.

What approach should I take to load the data while cleansing the spaces in SSIS?

Assumption: You don't have any control over the structure of the destination tables, but you can add records.

To maintain your foreign key constraint, you will need to create a "salary" record that you can use where the salary is not defined, e.g.:

Salary | Description | Level
9999 | Not Assigned | 1

Then you can use a Derived Column expression to create a new column in your dataset with an if-expression, and write that derived column to your destination table.

Like this:

Code Snippet

[Salary]=="" ? 9999 : [Salary]

This is not an optimal solution (even though it is very practical). Ideally you should restructure your tables:

Assumption: You can change the structure of your destination tables

In this case, you need to take the Salary column out of your destination table and put it into another table. The relationship is that employees can have zero or one salary code, so you can create a table relating employees to salary codes with two columns (EmployeeNumber, SalaryCode) and a primary key on EmployeeNumber.

(This is a drastic simplification, of course. In a real-world scenario you would probably want to track historical information too.)
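Both options above can be sketched in T-SQL. This is a minimal illustration under the assumptions already stated; the table and column names (dbo.Salary, dbo.Employee, dbo.EmployeeSalary) are hypothetical:

```sql
-- Option 1: keep the FK and add a sentinel "Not Assigned" salary row,
-- so blank codes in the flat file can be mapped to it at load time.
INSERT INTO dbo.Salary (SalaryCode, Description, Level)
VALUES ('9999', 'Not Assigned', 1);

-- Option 2: move the relationship into its own table, so an employee
-- with no salary code simply has no row here (no sentinel needed).
CREATE TABLE dbo.EmployeeSalary
(
    EmployeeNumber int NOT NULL PRIMARY KEY
        REFERENCES dbo.Employee (EmployeeNumber),
    SalaryCode char(4) NOT NULL
        REFERENCES dbo.Salary (SalaryCode)
);
```

With option 2, the SSIS package would use a Conditional Split to route only rows with a non-blank salary code into dbo.EmployeeSalary.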


Thanks for the quick reply. I tried doing as you said, using a Derived Column transformation with the REPLACE function, but it does not seem to capture the space as a value in the replace expression in SSIS.

REPLACE(Salary," ",9999) does not work; it says there might be a possibility of a divide-by-zero error. What expression should I use here to replace the values?


First of all, since this is a char value, you should put quotes around the 9999.

Second, I didn't say to use the REPLACE function, though I guess that would work; but you run the risk of having a valid id messed up by replacing a leading or trailing space with a 9999.

Also, if your salary code were " " (eight spaces), then the REPLACE function would produce "99999999999999999999999999999999", which might be too big for your destination field.

So try using TRIM first to get rid of spaces, then the ?: operator to replace a blank with "9999".

Put this into the Expression column of the Derived Column step:

Code Snippet

TRIM([Salary]) == "" ? "9999" : TRIM([Salary])


Thursday, February 16, 2012

Append Data to a Flat file

Hello,

I was wondering if there is a way for me to append data to a flat file. The reason I ask is that I need to create a header for the report that I am exporting.

The way I imagined this working would be to create a DTS package that exports the header information to a flat file, and then create another DTS package to export the report data and append it to the same file the header DTS created. This might not be the correct approach, so I was hoping I could get some guidance on how to accomplish this.

I am using SQL Server 2000.

Thank you!

Look up "BCP" in Books Online (BOL) to be able to do this in SQL.

Thank you for the tip. I will look this up and post the findings.

Unless they have changed things, BCP is not able to append to a data file. OSQL, however, can be made to append, given the proper command-line syntax (>>). At that point, however, the whole report may as well be done with a single script in OSQL. Why is Reporting Services not being considered here?

Right. I just tried the bcp command, and it will not append to the file. It overwrites it.

I am not using reporting services because it is not installed on the box.

Can you format the file that osql creates with pipes?
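Yes, osql can use a pipe as the column separator (-s"|"). A sketch of the two-pass approach, wrapped in xp_cmdshell for illustration: the first command creates the file with the header, the second appends the data via the shell's >> redirection. The server, database, table, and file path here are hypothetical:

```sql
-- Sketch only: MyServer, MyDb, dbo.Report, and C:\report.txt are made up.
-- -h-1 suppresses column headers; -s"|" makes the output pipe-delimited.
-- > creates/overwrites the file; >> appends to it (cmd shell redirection).
EXEC master..xp_cmdshell
    'osql -S MyServer -E -d MyDb -h-1 -s"|" -Q "SELECT ''ReportTitle|RunDate''" > C:\report.txt';
EXEC master..xp_cmdshell
    'osql -S MyServer -E -d MyDb -h-1 -s"|" -Q "SELECT Col1, Col2 FROM dbo.Report" >> C:\report.txt';
```

The same two commands can, of course, be run directly from a batch file instead of through xp_cmdshell.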

append current date to file name

Is there an easy way to append the current date to a file when using a flat file destination?

Easy when you know how

Hopefully this post will give you some pointers:

Dynamically set a logfile name
(http://blogs.conchango.com/jamiethomson/archive/2006/10/05/SSIS-Nugget_3A00_-Dynamically-set-a-logfile-name.aspx)

-Jamie
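The technique in that post boils down to putting an SSIS expression on the flat file connection manager's ConnectionString property. A sketch (the output path and file prefix are assumptions), producing date-stamped names in the form extract_YYYYMMDD.txt:

```
"C:\\output\\extract_"
 + (DT_WSTR, 4) YEAR(GETDATE())
 + RIGHT("0" + (DT_WSTR, 2) MONTH(GETDATE()), 2)
 + RIGHT("0" + (DT_WSTR, 2) DAY(GETDATE()), 2)
 + ".txt"
```

The RIGHT("0" + ..., 2) pattern zero-pads single-digit months and days so the names sort correctly.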

That was easy. Thanks.

Perfect timing. Thanks for your post.