Showing posts with label import.

Tuesday, March 27, 2012

Archiving

What's the best way to archive tables with binary data? Flat files/bcp/DTS
may not work if I need to import binary data (not a fixed-length column) that
contains quotes, etc. What I am thinking is to export to another database and
back it up/archive it on a regular basis. That way it will be easy to restore
when needed. Any other options?
Thanks,
BVR
Why not just back up the database itself?
One thing you could do is put the tables with binary data on their own
filegroup, then back up just that filegroup more frequently, if that's what
you need.
Adam Machanic
SQL Server MVP
http://www.sqljunkies.com/weblog/amachanic
"Uhway" <vbhadharla@sbcglobal.net> wrote in message
news:eMjAtpdKFHA.3552@TK2MSFTNGP12.phx.gbl...
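Adam's filegroup suggestion can be sketched in T-SQL roughly as follows. The database, file path, and table names here are made up for illustration, and the `image` column stands in for whatever binary column the tables actually use:

```sql
-- Hypothetical names throughout: AppDB, BinaryData, dbo.Documents.
-- 1. Give the binary tables their own filegroup.
ALTER DATABASE AppDB ADD FILEGROUP BinaryData

ALTER DATABASE AppDB
ADD FILE (NAME = 'AppDB_Binary', FILENAME = 'D:\Data\AppDB_Binary.ndf')
TO FILEGROUP BinaryData

-- 2. Create (or rebuild) the binary tables on that filegroup.
CREATE TABLE dbo.Documents (
    DocID   int   NOT NULL PRIMARY KEY,
    DocBody image NOT NULL
) ON BinaryData

-- 3. Back up just that filegroup on its own, more frequent schedule.
BACKUP DATABASE AppDB
FILEGROUP = 'BinaryData'
TO DISK = 'E:\Backup\AppDB_BinaryData.bak'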

Archiving

What's the best way to archive tables with binary data? Flat files/bcp/DTS
may not work if I need to import binary data (not a fixed-length column) that
contains quotes, etc. What I am thinking is to export to another database and
back it up/archive it on a regular basis. That way it will be easy to restore
when needed. Any other options?
Thanks,
BVR

Vyas has a great article on his web site about archiving:
http://vyaskn.tripod.com/
"Uhway" <vbhadharla@sbcglobal.net> wrote in message
news:OwmojqdKFHA.1476@TK2MSFTNGP09.phx.gbl...

Here's the link: http://vyaskn.tripod.com/sql_archive_data.htm
--
HTH,
Vyas, MVP (SQL Server)
SQL Server Articles and Code Samples @ http://vyaskn.tripod.com/
Narayanan,
I saw your link; it looks good, but it may not work if the orders for 6
months come to more than 100,000. What happens if the data is more than
100,000 rows, as we have in a real-time system?
My solution (taking a 6-month period of 100,000 records, where 10,000 must
stay on the live production database and the rest move to a history database)
is working great. This is how I did it:
1. Check the record count of the database, based on date.
2. Use a chunking method: copy 1,000-5,000 rows at a time, depending on the
load, from the live database to a temp archive database on the live server,
then call another stored procedure to copy the data from the temp archive to
the remote archive database server. (The reason: if the network between the
live and remote servers goes down for whatever reason, the data is still in
the temp archive, not lost, and you can come back and redo the copy.)
3. Once the data is copied to the remote archive server, delete it from the
live database with "delete from live where seqno in temp archive database",
and later delete it from the temp archive. (seqno is a primary key field.)
I go beyond this: once all the data has been copied, I check that the first
and last rows copied exist on the archive server, meaning the copy succeeded.
I wrote a program to do this; it just calls 3 stored procedures, and the
stored procedures do all of the above.
Too much, isn't it?
Arun
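Arun's chunking steps can be sketched in T-SQL roughly as below. Every object name (Live, TempArchive, the RemoteArchive linked server, Orders, SeqNo, OrderDate) is a hypothetical stand-in, and the linked server is an assumption; this is a sketch of the pattern, not Arun's actual procedures:

```sql
-- Hypothetical names throughout; RemoteArchive is assumed to be a linked server.
DECLARE @CutoffDate datetime
SET @CutoffDate = DATEADD(month, -6, GETDATE())

-- Step 2: copy one chunk (here up to 5000 rows) into the local temp archive.
SET ROWCOUNT 5000   -- SQL Server 2000 idiom; later versions would use TOP
INSERT INTO TempArchive.dbo.Orders
SELECT o.*
FROM Live.dbo.Orders o
WHERE o.OrderDate < @CutoffDate
  AND NOT EXISTS (SELECT 1 FROM TempArchive.dbo.Orders t
                  WHERE t.SeqNo = o.SeqNo)
SET ROWCOUNT 0

-- Push the chunk on to the remote archive server; skipping rows already
-- there makes the copy safe to redo after a network failure.
INSERT INTO RemoteArchive.ArchiveDB.dbo.Orders
SELECT t.*
FROM TempArchive.dbo.Orders t
WHERE NOT EXISTS (SELECT 1 FROM RemoteArchive.ArchiveDB.dbo.Orders r
                  WHERE r.SeqNo = t.SeqNo)

-- Step 3: only after the remote copy succeeds, delete from live,
-- keyed on the temp archive, then clear the temp archive.
DELETE FROM Live.dbo.Orders
WHERE SeqNo IN (SELECT SeqNo FROM TempArchive.dbo.Orders)

DELETE FROM TempArchive.dbo.Orders
```

The two-hop design (live to temp archive to remote) is what gives the redo safety Arun describes: each statement is written so re-running it after a failure copies only the missing rows.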


Thursday, March 8, 2012

Application Security

We have a Visual Basic 5 .exe that is used to launch an application process
to import files from our application server to update the database server.
Both servers are Windows 2003 Server Standard Edition. The application
launches MS Access 2000 and imports files into an Access database table, and
then connects to our database server hosting SQL Server 2000 SP3 to update a
master table of users.
The VB app is launched with a local account on the app server and uses a
registry value to get the database connection string, using SQL
Authentication. We have noticed an authentication error message (Failure
Event ID 529) on the SQL server for the account launching the scheduled task
that runs the imports. The error only occurs on the SQL server when a load
occurs on the application server. The load occurs when users run a Web ASP
application and run end-of-month procedures. It connects to the same SQL
server with the same SQL-authenticated credentials.
The import VB application does not fail to authenticate unless a load is
present on the server. Why would the security context change and move away
from SQL Authentication? The workaround so far is to use a domain account or
a mirrored account on the SQL server.
Thanks.

It sounds like the VB application is using a connection string that is
requesting Windows Authentication. This is why it works if you use a domain
account or duplicate the user account and password. Check the Security tab in
Enterprise Manager and verify that you are allowing both SQL and Windows
Authentication. Enable auditing for both failed and successful logins, and
test using both standard and Windows Authentication to validate the logging.
Thanks,
Kevin McDonnell
Microsoft Corporation
This posting is provided AS IS with no warranties, and confers no rights.
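For illustration, the difference Kevin describes usually comes down to which connection string the app actually reads at run time. The server, database, and account names below are made up, and the exact string the app uses is in its registry value, not shown here:

```
' SQL Authentication -- what the registry value is expected to hold
Provider=SQLOLEDB;Data Source=DBSERVER;Initial Catalog=AppDB;User ID=AppUser;Password=secret

' Windows Authentication -- a string like this would explain the Event ID 529
' failures, since the local task account is unknown on the SQL Server box
Provider=SQLOLEDB;Data Source=DBSERVER;Initial Catalog=AppDB;Integrated Security=SSPI
```

If any code path falls back to `Integrated Security=SSPI` (or omits the User ID/Password pair), the connection is attempted as the Windows account running the scheduled task, which matches the symptom of a mirrored or domain account making the errors go away.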