Sunday, March 25, 2012

Archiving

What's the best way to archive tables with binary data? Flat files/bcp/DTS
may not work if I need to import binary data (not a fixed-length column)
containing quotes etc. What I am thinking is to export to another database
and back it up/archive on a regular basis. That way it will be easy to
restore when needed. Any other options?
Thanks
BVR
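
A minimal sketch of that export-to-another-database approach. All names here
(LiveDB, ArchiveDB, dbo.Orders, SeqNo, OrderDate, Payload) are hypothetical;
INSERT ... SELECT moves varbinary columns intact, avoiding the
quoting/delimiter problems of flat-file export:

-- Hypothetical example: move rows older than six months into a separate
-- archive database that can be backed up on its own schedule.
INSERT INTO ArchiveDB.dbo.Orders (SeqNo, OrderDate, Payload)
SELECT SeqNo, OrderDate, Payload
FROM LiveDB.dbo.Orders
WHERE OrderDate < DATEADD(month, -6, GETDATE())

-- Back up the archive database independently of the live database:
BACKUP DATABASE ArchiveDB TO DISK = 'D:\Backups\ArchiveDB.bak'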
Vyas has a great article on his web site about archiving.
http://vyaskn.tripod.com/
"Uhway" <vbhadharla@.sbcglobal.net> wrote in message
news:OwmojqdKFHA.1476@.TK2MSFTNGP09.phx.gbl...
> What's the best way to archive tables with binary data? Flat
files/bcp/DTS
> may not work if need to import binary data(not a fixed length column) with
> quotes etc. What I am thinking is to export to another database and back
it
> up/archive on a regular basis. That way it will be easy to restore when
> needed. Any other options?
> Thanks
> BVR
>
Here's the link: http://vyaskn.tripod.com/sql_archive_data.htm
HTH,
Vyas, MVP (SQL Server)
SQL Server Articles and Code Samples @ http://vyaskn.tripod.com/
"Uri Dimant" <urid@.iscar.co.il> wrote in message
news:Ogj%23SXfKFHA.3332@.TK2MSFTNGP15.phx.gbl...
Vyas has a great article on his web site about archiving.
http://vyaskn.tripod.com/
"Uhway" <vbhadharla@.sbcglobal.net> wrote in message
news:OwmojqdKFHA.1476@.TK2MSFTNGP09.phx.gbl...
> What's the best way to archive tables with binary data? Flat
files/bcp/DTS
> may not work if need to import binary data(not a fixed length column) with
> quotes etc. What I am thinking is to export to another database and back
it
> up/archive on a regular basis. That way it will be easy to restore when
> needed. Any other options?
> Thanks
> BVR
>
Narayanan,
I saw your link. It looks good, but I think it may not work well if there
are more than 100,000 orders in a 6-month period. What happens when the data
grows beyond 100,000 rows, as it does in a real-time system?
My solution was (let's take a 6-month period with 100,000 records): say you
have to leave 10,000 rows in the live production database and move the rest
to a history database.
This is how I did it, and it's working great (a T-SQL sketch follows the
list):
1. Check the record count of the live database based on date.
2. Use a chunking method: copy 1,000-5,000 rows at a time, depending on the
load, from the live database to a temp archive database on the live server,
then call another stored procedure to copy the data from the temp archive to
the remote archive database server. (The reason: if the network between the
live server and the remote server goes down for whatever reason, the data
still sits safely in the temp archive, and you can come back and redo the
transfer.)
3. Once the data is copied to the remote archive server, come back and
delete it from the live database ("DELETE FROM live WHERE seqno IN the temp
archive's seqnos"), and later delete it from the temp archive. (seqno is a
primary key field.)
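Here's a minimal sketch of those three steps. All names are hypothetical
(LiveDB.dbo.Orders keyed on SeqNo, a TempArchive database on the live
server, a linked server named ARCHIVESRV for the remote archive server), and
error handling plus the leave-10,000-rows cutoff are left out:

DECLARE @cutoff datetime
SET @cutoff = DATEADD(month, -6, GETDATE())

-- Loop until no qualifying rows remain on the live server.
WHILE EXISTS (SELECT * FROM LiveDB.dbo.Orders WHERE OrderDate < @cutoff)
BEGIN
    -- Step 2a: copy one chunk (the oldest 5000 rows) to the temp archive.
    INSERT INTO TempArchive.dbo.Orders (SeqNo, OrderDate, Payload)
    SELECT TOP 5000 SeqNo, OrderDate, Payload
    FROM LiveDB.dbo.Orders
    WHERE OrderDate < @cutoff
    ORDER BY SeqNo

    -- Step 2b: push the chunk from the temp archive to the remote server.
    INSERT INTO ARCHIVESRV.ArchiveDB.dbo.Orders (SeqNo, OrderDate, Payload)
    SELECT SeqNo, OrderDate, Payload
    FROM TempArchive.dbo.Orders

    -- Step 3: only after the remote copy succeeds, delete from live...
    DELETE FROM LiveDB.dbo.Orders
    WHERE SeqNo IN (SELECT SeqNo FROM TempArchive.dbo.Orders)

    -- ...and clear the temp archive for the next chunk.
    DELETE FROM TempArchive.dbo.Orders
END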
I go beyond this: once all the data has been copied, I check that the first
and last rows copied exist on the archive server, meaning the copy
succeeded.
I wrote a program to do this; it just calls three stored procedures, and
those stored procedures do everything described above. A sketch of the
verification check follows.
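A minimal sketch of that first/last check, under the same hypothetical
names as above:

DECLARE @first int, @last int, @cutoff datetime
SET @cutoff = DATEADD(month, -6, GETDATE())

-- Before archiving: note the first and last qualifying keys.
SELECT @first = MIN(SeqNo), @last = MAX(SeqNo)
FROM LiveDB.dbo.Orders
WHERE OrderDate < @cutoff

-- ... run the chunked archive here ...

-- After archiving: if both endpoints reached the remote server, the copy
-- covered the whole range.
IF EXISTS (SELECT * FROM ARCHIVESRV.ArchiveDB.dbo.Orders WHERE SeqNo = @first)
   AND EXISTS (SELECT * FROM ARCHIVESRV.ArchiveDB.dbo.Orders WHERE SeqNo = @last)
    PRINT 'Archive verified'
ELSE
    RAISERROR('Archive verification failed', 16, 1)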
Too much, isn't it?
Arun
"Narayana Vyas Kondreddi" wrote:

> Here's the link: http://vyaskn.tripod.com/sql_archive_data.htm
> --
> HTH,
> Vyas, MVP (SQL Server)
> SQL Server Articles and Code Samples @. http://vyaskn.tripod.com/
>
> "Uri Dimant" <urid@.iscar.co.il> wrote in message
> news:Ogj%23SXfKFHA.3332@.TK2MSFTNGP15.phx.gbl...
>
> Vyas has a great article on his web site about archiving.
> http://vyaskn.tripod.com/
>
>
> "Uhway" <vbhadharla@.sbcglobal.net> wrote in message
> news:OwmojqdKFHA.1476@.TK2MSFTNGP09.phx.gbl...
> files/bcp/DTS
> it
>
>
