Showing posts with label guys. Show all posts

Tuesday, March 27, 2012

Archiving very slow, help needed please

Hey guys,
The vendor gave us an archiving tool for our huge tables. We gave them a
production copy of our tables. They tested their tool against their own
server and told us that it takes 2 seconds to insert each record. I know
this is already bad. But today I tested it against our own test server.
They also gave us instructions on what to do before running the
archiving tool. Well, anyway, after running the archiving tool, it took 20
seconds to insert 1 record. That's totally bad!
I would like to know if you guys will be able to help me identify the
problem by just looking at these links.
http://restricted.dyndns.org/executionplan.txt
http://restricted.dyndns.org/execplan1.gif
http://restricted.dyndns.org/execplan2.gif
http://restricted.dyndns.org/execplan3.gif
The INSERT statement that you will see there consumed 10 seconds of CPU and
30 seconds of duration.
Are there any other statements that I can execute against my captured
Profiler table that can help us troubleshoot?
Any help will be greatly appreciated.
Thanks.
V1rt

>> told us that it takes 2 seconds to insert each record
well...
Go back to the vendor (I assume you haven't paid them yet) and tell them
this isn't acceptable.
How many records do you have to deal with - even at 2 secs per record?

It looks like you are filtering the source table, Enclosure. Nothing wrong
with that. However, then you are joining this to the destination table.
Why?
An archive table usually doesn't contain the data in it yet. Are you
joining to make sure that you don't attempt to archive data that's already
been copied?
If so, there are better ways to write this. If you are INSERTing into a
table (not updating it), there is rarely a need to join the destination to
the source. The join only slows you down, because every candidate row then
has to be matched against the destination table before it can be inserted.
Try something more like this:
BEGIN TRANSACTION

INSERT INTO TableDestination
    (Column1, Column2, ..., ColumnN)
SELECT Column1, Column2, ..., ColumnN
FROM TableSource
WHERE TableSource.ColumnX = ExpressionX

IF @@ERROR <> 0 BEGIN
    ROLLBACK TRANSACTION
    RETURN
END

DELETE s
FROM TableDestination AS d
INNER JOIN TableSource AS s
    ON d.Key1 = s.Key1
    AND d.Key2 = s.Key2
    AND ...
    AND d.KeyN = s.KeyN

IF @@ERROR <> 0 BEGIN
    ROLLBACK TRANSACTION
    RETURN
END

COMMIT TRANSACTION
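If you really do need to guard against re-archiving rows that are already in
the destination, a NOT EXISTS test is usually cheaper than joining the two
tables. A sketch only, using the same placeholder names as above:

```sql
-- Insert only rows whose keys are not already in the destination.
-- TableDestination/TableSource and the key columns are placeholders.
INSERT INTO TableDestination (Key1, Key2, Column1)
SELECT s.Key1, s.Key2, s.Column1
FROM TableSource AS s
WHERE s.ColumnX = ExpressionX
  AND NOT EXISTS (SELECT 1
                  FROM TableDestination AS d
                  WHERE d.Key1 = s.Key1
                    AND d.Key2 = s.Key2)
```

This way the destination is probed only on the keys, instead of being joined
into the plan that produces the inserted rows.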
Sincerely,
Anthony Thomas

Hi Anthony,
Thanks for the awesome reply. Am I correct that the destination table
being joined is TABLEFIELDAUDIT? That's what I saw in the INSERT
statement.
Below is what I captured using Profiler. I saw tons of these running for 20+
seconds each. :(
INSERT INTO TABLEFIELDAUDIT
    (TABLENAME, FIELDNAME, FIELDVALUE, CHANGEDATE, KEYVALUE, USERID,
     SUBKEY1, SUBKEY2)
SELECT TABLENAME, FIELDNAME, FIELDVALUE, CHANGEDATE, E.RECORDID AS KEYVALUE,
       USERID, SUBKEY1, SUBKEY2
FROM TABLEFIELDAUDIT
INNER JOIN ENCLOSURE E
    ON TABLEFIELDAUDIT.SUBKEY1 = E.BARCODE
    AND TABLEFIELDAUDIT.SUBKEY2 = E.ENCLOSURENUMBER
WHERE TABLENAME = 'ENCLOSURE' AND SUBKEY1 = '00010690'
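Could part of the problem be missing indexes on the join and filter columns?
I was thinking of trying something like this on the test copy first (the
index names are just made up; I'd need to check them against the actual
schema):

```sql
-- Hypothetical supporting indexes for the join above (test server only):
CREATE INDEX IX_TFA_TableName_Subkeys
    ON TABLEFIELDAUDIT (TABLENAME, SUBKEY1, SUBKEY2)

CREATE INDEX IX_Enclosure_Barcode_Number
    ON ENCLOSURE (BARCODE, ENCLOSURENUMBER, RECORDID)
```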
Thanks again,
Neil



Sunday, March 25, 2012

archiving

Hey guys,
I would like to know how SQL archiving is written. What is the best approach,
such as do-this / don't-do-this guidelines? Sample code would be very
helpful too.
Thanks.
Neil
Neil
Look at Vyas's example
CREATE PROC dbo.ArchiveData
(
    @CutOffDate datetime = NULL
)
AS
BEGIN
    SET NOCOUNT ON

    IF @CutOffDate IS NULL
    BEGIN
        SET @CutOffDate = DATEADD(mm, -6, CURRENT_TIMESTAMP)
    END
    ELSE
    BEGIN
        IF @CutOffDate > DATEADD(mm, -3, CURRENT_TIMESTAMP)
        BEGIN
            RAISERROR ('Cannot delete orders from last three months', 16, 1)
            RETURN -1
        END
    END

    BEGIN TRAN

    INSERT INTO Archive.dbo.Orders
    SELECT *
    FROM dbo.Orders
    WHERE OrderDate < @CutOffDate

    IF @@ERROR <> 0
    BEGIN
        ROLLBACK TRAN
        RAISERROR ('Error occurred while copying data to Archive.dbo.Orders', 16, 1)
        RETURN -1
    END

    INSERT INTO Archive.dbo.OrderDetails
    SELECT *
    FROM dbo.OrderDetails
    WHERE OrderID IN
    (
        SELECT OrderID
        FROM dbo.Orders
        WHERE OrderDate < @CutOffDate
    )

    IF @@ERROR <> 0
    BEGIN
        ROLLBACK TRAN
        RAISERROR ('Error occurred while copying data to Archive.dbo.OrderDetails', 16, 1)
        RETURN -1
    END

    DELETE dbo.OrderDetails
    WHERE OrderID IN
    (
        SELECT OrderID
        FROM dbo.Orders
        WHERE OrderDate < @CutOffDate
    )

    IF @@ERROR <> 0
    BEGIN
        ROLLBACK TRAN
        RAISERROR ('Error occurred while deleting data from dbo.OrderDetails', 16, 1)
        RETURN -1
    END

    DELETE dbo.Orders
    WHERE OrderDate < @CutOffDate

    IF @@ERROR <> 0
    BEGIN
        ROLLBACK TRAN
        RAISERROR ('Error occurred while deleting data from dbo.Orders', 16, 1)
        RETURN -1
    END

    IF @@TRANCOUNT > 0
    BEGIN
        COMMIT TRAN
        RETURN 0
    END
END
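You would then call it from a scheduled job, for example (the explicit date
below is only an illustration; it must be more than three months in the
past, or the procedure raises an error and returns -1):

```sql
-- Use the default six-month cutoff:
EXEC dbo.ArchiveData

-- Or supply an explicit cutoff date (example date only):
EXEC dbo.ArchiveData @CutOffDate = '20040101'
```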


Tuesday, March 20, 2012

Applying the snapshot for Merge rep fails every time because Connection is reset?

Hi guys, I'm desperately hoping someone here can help me. I've been trying to set up merge replication between 2 servers all weekend and it keeps failing. I'm setting this up as follows:
The Publisher/Distributor is a SQL Server 2005 machine, the database being replicated is 4 GB, and the only subscriber is a SQL Server 2000 SP4 machine. I set up my publication normally, set up the push subscription, and set it to initialize immediately. It will then start bulk copying and go on for about 3-4 hours before it fails and gives me these errors:

Error messages:
The merge process was unable to deliver the snapshot to the Subscriber. If using Web synchronization, the merge process may have been unable to create or write to the message file. When troubleshooting, restart the synchronization with verbose history logging and specify an output file to which to write. (Source: MSSQL_REPL, Error number: MSSQL_REPL-2147201001)
Get help: http://help/MSSQL_REPL-2147201001
The process could not bulk copy into table '"dbo"."CONTENT"'. (Source: MSSQL_REPL, Error number: MSSQL_REPL20037)
Get help: http://help/MSSQL_REPL20037
An existing connection was forcibly closed by the remote host. (Source: MSSQLServer, Error number: 0)
Get help: http://help/0
One or more BLOB columns could not be sent to the server, attempt to recover from the problem failed. (Source: MSSQLServer, Error number: 0)
Get help: http://help/0
Communication link failure (Source: MSSQLServer, Error number: 0)
Get help: http://help/0
Unspecified error (Source: MSSQLServer, Error number: 0)
Get help: http://help/0
Failed to send batch after max errors (Source: MSSQLServer, Error number: 0)

This is driving me crazy, as I haven't slept in ages trying to get this fixed. Anyone have any idea what's going on? It's always one of 2 tables so far that keep failing, ContactLog and Content. What can I do?

For anyone who finds an error similar to this one, it's the line "forcibly closed by the remote host" that was the most interesting. Well, I found the solution here, in section 4.1.2: http://support.microsoft.com/kb/910228
Basically, SP1 for Win 2K3 includes a lovely feature to prevent DoS attacks, and rather cleverly it thinks SQL Server applying a snapshot is a DoS attack, so it closes the connection after a set number of connects/disconnects. Jeebus wept. 1 simple registry key cost me my weekend :-|
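For the record, the setting involved was the TCP/IP SYN attack protection
value. I'm going from memory here, so treat this as a sketch and confirm the
exact key name and supported values against section 4.1.2 of the KB article
before touching the registry:

```
; Windows Registry fragment (verify against KB 910228 before applying)
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters
    SynAttackProtect (REG_DWORD) = 0   ; 0 disables the SYN attack heuristic
```

A reboot (or at least a TCP/IP restart) is typically needed for the change
to take effect.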

Hopefully this will save someone the same heartache!

Keep up the good work guys

Hello Shane,

Did you find a solution to your problem? If yes, please let me know too, because I have been facing the same problem for the last two weeks and have not been able to solve it. It's causing me a headache, so please tell me if you have the solution.
My email id is dr_hunaindurrani@hotmail.com.

Thanks in advance.


Sunday, March 11, 2012

Apply permissions to user, role or group?

Hi guys,

I'd appreciate your thoughts on this. I haven't done much DB admin; usually I do development work and the admins do the admin.

The database is behind an API of stored procedures to manipulate the data, and views to select the data.

The database needs to be accessed remotely by multiple clients.

How best to keep the database secure?

Create a new user and login on the database which is made known to all client applications. Then grant execute permission on the stored procs and grant select on the views?

There is probably a better way than one login for all? Should I be looking at roles and groups etc? If so, how best to set that up?

A few pointers would be gratefully received!

What are you trying to protect and from whom? Who should have access to what? What kinds of access do you want to allow? You should start by asking yourself such questions and once you gather the answers, you can start designing your application security to enforce these access restrictions.

There is no best database security model - if you don't have anything to protect, you won't need a security model at all. Best is relative to the needs of a specific application.

If you tell us what you are trying to obtain, we'll try to help you get it.

Thanks
Laurentiu

|||

Thanks Laurentiu,

The database contains billing information and server will be (in some cases) visible on the Internet. Users manage the data using client application software. It is this software that uses the stored proc API.

I created a specific login known to the software and proceeded to grant execute permission on the API to this login.

The API is quite extensive and while doing this I wondered if there was a better way, perhaps using roles or whatever, so that I can grant permissions to the API once and then allow different logins, including SSPI logins to be members of that role, or group or whatever it should be.

Looking for pointers and advice on the best direction to go on this.

|||

I still need more details about your users: do they have different roles, which would require different access restrictions to your data? If they all have the same level of access, do you need to distinguish between them? Would you want to know, for example, who did what operation, and have an auditing system?

There are many ways to achieve security, and it is hard to tell what is the best solution for you. Here are some possibilities, but I can't really recommend one without knowing more about what you are trying to do.

1) You can have all users connecting to your application with the same credentials, and the software will connect to the server as some login. This won't allow you to know who does what, it will only restrict access to those that know how to connect to the application.

2) You can have users connecting to your application with distinct credentials. You would manage these credentials within your application. The connections to the server would be done using the same login, but your application can implement custom auditing because it does the user authentication. Also, any access rights will have to be controlled at the application layer.

3) You can have users connecting again with distinct credentials, but in this case the credentials correspond to SQL Server logins, and for each user, you connect to SQL Server using the corresponding login. You can do auditing in this case either at the application level or at the server level, within the stored procedures that you call. You can manage rights granted to users at SQL Server level and you can use roles for easier management.

If you don't know exactly what you will want to do, it is a good idea to keep your options open. So, use roles and grant permissions on roles; then, if you need to have those permissions accessible to more than one user, you can just add them to the role.
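As a rough sketch of that advice (the role, procedure, view, and login names below are made up, and the CREATE ROLE/CREATE USER syntax is SQL Server 2005's; on SQL 2000 you would use sp_addrole and sp_grantdbaccess instead):

```sql
-- One role represents "may call the stored-procedure API"
CREATE ROLE api_user;

-- Grant the API surface to the role, not to individual users
GRANT EXECUTE ON dbo.usp_AddInvoice TO api_user;  -- repeat per procedure
GRANT SELECT  ON dbo.vw_Invoices    TO api_user;  -- repeat per view

-- Each login (SQL or Windows/SSPI) gets a database user, added to the role
CREATE USER BillingApp FOR LOGIN BillingApp;
EXEC sp_addrolemember 'api_user', 'BillingApp';
```

Adding a second login later is then just another CREATE USER plus sp_addrolemember; none of the per-object grants need repeating.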

Hope this helps.

Laurentiu

|||

Laurentiu

Been playing around with it and option 3 seems to suit us best. Keeps it flexible, for example, it allows us to split the API into different roles.

Thanks for your ideas!

Apply or OK buttons not working after client-server installation

Hi hoping someone can help me...
We were running a stand-alone version of RS fine on a laptop.
The IT guys have completed a server installation of RS. When I get to the Report
Manager home page, click on a report, view the properties, and change the
name of the report, I click Apply and nothing happens at all... it just sits
there. If I refresh the page, the original name of the report is loaded up.
This is happening with all the Apply and OK buttons in Report Manager, but
the reports that are deployed do run and work fine. As none of the buttons
work I cannot give people access to view the report, nor can I set or apply
any execution properties....
Help greatly appreciated!!
--
Technical Data Analyst
Hammersmith & W London College
Do you have appropriate permissions to do the things you wish?
--
Wayne Snyder, MCDBA, SQL Server MVP
Mariner, Charlotte, NC
www.mariner-usa.com
(Please respond only to the newsgroups.)
I support the Professional Association of SQL Server (PASS) and it's
community of SQL Server professionals.
www.sqlpass.org
"Munish RaasmaanDaas" <MunishRaasmaanDaas@.hotmail.com> wrote in message
news:6548A2C9-4345-4C64-BBDD-C895D977BB8C@.microsoft.com...
> Hi hoping someone can help me...|||Yes, I have dbo permission on the databases and have Builtin\admin Content
Manager. Even the OK button on the Assign New Role screen doesn't work, so I
can't add any new users.
Help greatly appreciated...
"Wayne Snyder" wrote:
> Do you have appropriate permissions to do the things you wish?|||Did you enable Active Scripting?
"Munish RaasmaanDaas" wrote:
> Yes, I have dbo permission on the databases and have Builtin\admin Content
> Manager. Even the OK button on the Assign New Role screen doesn't work, so I
> can't add any new users.|||Do I enable Active Scripting on the server or the client?
Thanks Bjorn
"Björn Grünberg" wrote:
> Did you enable Active Scripting?|||The client is already set to enable Active Scripting.
"Munish RaasmaanDaas" wrote:
> Do I enable Active Scripting on the server or the client?
> Thanks Bjorn|||You have to enable Active Scripting in the browser you use to work with
Report Manager (Internet Options - Security - Custom Level - Scripting:
Active Scripting: Enable).
I had the same problem as you, and solved it this way.
Good luck!!!
Björn
"Munish RaasmaanDaas" wrote:
> The client is already set to enable Active Scripting.|||Munish, did you find a solution? I have the same problem and I have active
scripting enabled.
Thanks
Dave
"Munish RaasmaanDaas" wrote:
> Hi hoping someone can help me...

Friday, February 24, 2012

Appending Stored Procedure

I need to make sure I'm doing this correctly; can you help me out, guys, please? This is an appending stored procedure: it should move rows from the EmployeeGamingLicense table to the GCEmployeeTerms table when the status is changed to TERMINATED. Here's what I have so far; I'm having problems with the rest of the script and getting errors.

CREATE PROCEDURE [insert_GCEmployeeTerms_1]
(@TM_#_1 [int],
@FirstName_2 [nvarchar](50),
@LastName_3 [nvarchar](50),
@SocialSecurityNumber_4 [int],
@DateHired_5 [datetime],
@Status_6 [nvarchar](50),
@TerminationDate_7 [datetime],
@Title_8 [nvarchar](50),
@DepartmentName_9 [nvarchar](50),
@TermReason_10 [ntext],
@VoluntaryInvoluntary_11 [nvarchar](50))

AS INSERT INTO [CommissionEmployee_Exclusionsdb].[dbo].[GCEmployeeTerms]
( [TM #],
[FirstName],
[LastName],
[SocialSecurityNumber],
[DateHired],
[Status],
[TerminationDate],
[Title],
[DepartmentName],
[TermReason],
[VoluntaryInvoluntary])

SELECT
( @TM_#,
@FirstName,
@LastName,
@SocialSecurityNumber,
@DateHired,
@Status,
@TerminationDate,
@Title,
@DepartmentName,
@TermReason,
@VoluntaryInvoluntary)
FROM EmployeeGamingLicense
WHERE STATUS = 'TERMINATED'
GO
Methinks thou wants not both tables and variables in yon SELECT list. There be dragons in that mix!

-PatP|||Thank you, Pat, for that very old English/Shakespearean/King Arthur answer..lol. Let me see if I interpreted that correctly...You're saying that my select list is incorrect, referring to both tables...I need to go look it up..sorry, this is my first append stored procedure.|||If you stop and think about it, you are passing a whole flock (that would be a really technical term, eh?) of parameters to the procedure. This would be a good thing if you wanted to insert those parameters one time. You are also using a table with a where clause, which would be a good thing if you wanted to insert columns from that table for every row that qualified. However, doing both means that you want to insert one copy of the parameters that you are passing for every row that qualifies... Which is probably NOT what you intended!

You probably want to either use the table with column names, or the parameters without the table, but only one or the other!

-PatP|||I'm just being nosy now, but did you ever get this one sorted out to your satisfaction?

-PatP|||Sniff Sniff Sniff ...No :(

I just want to insert data from one table to another automatically through stored procedures. I can make part of the procedure (the part that inserts); I need to tell it to pull from another table...I don't know, I keep getting errors. It's like a puzzle I can only get part of but not all (banging head against wall)|||a whole flock of parameters

Pat. I thought it was a bevy of parameters. Hmm. May need to add it to the lists...

http://encyclopedia.thefreedictionary.com/List%20of%20collective%20nouns%20by%20collective%20term%20A-K|||ok so you're saying it's just a bunch of parameters..a horde of parameters|||Sniff Sniff Sniff ...No :(
Now, now M'Dear! You mustn't get too upset. Might I suggest trying something like:

CREATE PROCEDURE [insert_GCEmployeeTerms_1]
AS

INSERT INTO [CommissionEmployee_Exclusionsdb].[dbo].[GCEmployeeTerms] (
[TM #], [FirstName], [LastName]
, [SocialSecurityNumber], [DateHired], [Status]
, [TerminationDate], [Title], [DepartmentName]
, [TermReason], [VoluntaryInvoluntary]
) SELECT
a.TM_#, a.FirstName, a.LastName
, a.SocialSecurityNumber, a.DateHired, a.Status
, a.TerminationDate, a.Title, a.DepartmentName
, a.TermReason, a.VoluntaryInvoluntary
FROM EmployeeGamingLicense AS a
WHERE a.STATUS = 'TERMINATED'

RETURN
GO
-PatP|||I did what you said, Pat. I got some errors, but what you put up gave me a great start. I was not referring to the table fields that will do the inserting for me; the column names in that table are not exactly the same..OK I GET IT A LIGHT HAS GONE OFF AND I TOTALLY GET IT NOW..Now I see why SQL was like WHAT THE HELL ARE YOU TALKING ABOUT..OMG I die a happy woman now...Thank you so much, Pat, you're truly a SQL GOD..I totally get it now. :)

THANK YOU|||and Mccowley was correct I see his point now|||Now that you grok (http://searchsmallbizit.techtarget.com/sDefinition/0,,sid44_gci212216,00.html) how this works, you'll need to think a bit about how to limit the carnage. You won't always want to turn that procedure loose to copy every terminated employee row, or you'll end up with a gazillion copies of each row in GCEmployeeTerms (one for every time the procedure is run while that terminated row is present), which probably isn't what you want!

-PatP|||Ok...what happens is our dept gets a list of terminated employees, so when we get this list we go to the [GamingCommissiondb].[dbo].[EmployeeGamingLicense], which is the active table, look up the employee, and change their status to 'TERMINATED'. Now there can be as many as 20-30 terminations. The stored procedure will then take those employees that have been changed to Terminated and APPEND (insert) them into the TERMINATION.tbl. One of the purposes of doing this is that if the employee comes back as a rehire, we can check to see if there was any negative action taken against their gaming license. Does that make sense??|||Oh heavens yes, but unless you delete the row from the active table when this happens, that row will be copied every time the procedure runs (at least as it is written right now). The only criterion that you are using to select rows is whether they are flagged as terminated... You might want to be a wee bit more selective than that, maybe throwing in a test on the termination date or something wild and crazy like that!

-PatP|||I understand what you're saying. I don't want my users to delete a record in error, so I didn't want to delete the record from the active table through the stored procedure. I have a view that shows only the terminated employees from the active table that I look at every 2 weeks, just a QC thing. Also, both the tables have the TM# as a primary key; the first time I tried to execute the stored procedure I got "PK constraint error duplicates"..something like that, so once I found out what records had already been appended to the TERMINATION table and which ones had not been, the SP executed successfully. Does that make sense?|||Yep, that it does. You could also use a WHERE NOT EXISTS clause to test for the presence of the TM# while you were doing the INSERT operation. This might be simpler.

-PatP|||Yes, you're absolutely right. I'll alter the procedure then. Thank you again for your help, Pat, I sure do appreciate it :)
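For the record, Pat's WHERE NOT EXISTS suggestion might look something like this, sketched against the names used earlier in the thread (untested, and abbreviated to a few columns):

```sql
INSERT INTO [CommissionEmployee_Exclusionsdb].[dbo].[GCEmployeeTerms] (
      [TM #], [FirstName], [LastName], [Status], [TerminationDate]
      -- ...remaining columns as in the version above
) SELECT
      a.TM_#, a.FirstName, a.LastName, a.Status, a.TerminationDate
FROM EmployeeGamingLicense AS a
WHERE a.Status = 'TERMINATED'
  -- skip anyone whose TM# is already in the terminations table
  AND NOT EXISTS (SELECT 1
                  FROM [CommissionEmployee_Exclusionsdb].[dbo].[GCEmployeeTerms] AS t
                  WHERE t.[TM #] = a.TM_#);
```

Rows already copied are simply skipped, so the procedure can be rerun safely without tripping the PK constraint.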

Sunday, February 19, 2012

Append to a field in a database

Hey guys,
I am working on some ASP stuff, and I have an HTML table which displays records from a SQL table. There is one field for adding notes which I want to append to. Basically I have a textbox; if a user enters information in it, it should be appended to a field (rNotes) in my database table. Right now all I can get it to do is overwrite the current information, as I'm using an UPDATE query. Any ideas?
In the UPDATE statement you should use:

UPDATE table_name SET
column_name = column_name + new_data
WHERE ....|||

Quote:

Originally Posted by nikhil86

Hey guys,
I am working on some ASP stuff, and I have an HTML table which displays records from a SQL table...


Personally, I would create a notes table, and add records. This way, you can also store things like date/time, user id, etc. You can then just display them in sequential order. I do this with ASP all the time. Just did it yesterday, as a matter of fact. The client loved it.
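Michael's separate-table approach might be sketched like this; every name below is hypothetical, and nvarchar(max) assumes SQL Server 2005 or later (use ntext or a sized nvarchar on SQL 2000):

```sql
CREATE TABLE RecordNotes (
    NoteID   int IDENTITY(1,1) PRIMARY KEY,
    RecordID int NOT NULL,                     -- key of the row the note belongs to
    NoteText nvarchar(max) NOT NULL,
    NoteBy   nvarchar(50)  NOT NULL,           -- user id
    NoteAt   datetime NOT NULL DEFAULT GETDATE()
);
```

Each "append" then becomes a plain INSERT, and the page displays the notes in sequential order with ORDER BY NoteAt.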

Good luck,

Michael C. Gates
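And if the original single-column design is kept, the first reply's pattern fleshed out might look like this. Table, key, and parameter names are hypothetical; ISNULL stops a NULL rNotes from swallowing the append, and this assumes rNotes is (n)varchar, since an ntext column cannot be concatenated with + directly:

```sql
-- @NewNote carries the text from the ASP textbox (hypothetical names throughout)
UPDATE dbo.Requests
SET rNotes = ISNULL(rNotes, '') + CHAR(13) + CHAR(10) + @NewNote
WHERE RequestID = @RequestID;
```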

Monday, February 13, 2012

Apparent System hang during intensive long running script: SQL Server 2000 Best Server Settings

Hi,
Our db server appeared to hang during a long running script last
night. Our windows guys rebooted the server. However my suspicion is
that the script was consuming all systems resources but was still
running.
The SQL server is set to use all CPUs (4) and max all memory. It also
has raised priority on Windows. Does this sound like a possibility? If
so, what are the best settings for a db server, just so that we can
keep access during intensive queries?
When the server came back up, sql spent a great deal of time
recovering this database, presumably because it was mid-transaction.
Cheers,
James
MCDBAYou might want to limit memory a bit (sp_configure, max server memory), definitely remove higher
priority. For the batch, consider using MAXDOP options for your queries, leaving some CPUs to other
SQL users. But also look at how the batch is written, as part of the problem might have been
blocking. For instance, perhaps you can split up some modifications over several transactions
instead of one big transaction.
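The first two suggestions might look like this; the numbers are placeholders for a 4-CPU box and the table is hypothetical, so tune them to the actual hardware and workload:

```sql
-- Leave memory headroom for the OS (value in MB; placeholder number)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory', 3072;
RECONFIGURE;

-- Cap parallelism on the heavy statement itself, leaving CPUs for other users
UPDATE dbo.BigTable
SET ArchiveFlag = 1
WHERE CreatedDate < '20060101'
OPTION (MAXDOP 2);
```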
--
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
"JimLad" <jamesdbirch@.yahoo.co.uk> wrote in message
news:1175162797.522975.64110@.p77g2000hsh.googlegroups.com...
> Hi,
> Our db server appeared to hang during a long running script last
> night.|||On Mar 29, 11:20 am, "Tibor Karaszi"
<tibor_please.no.email_kara...@.hotmail.nomail.com> wrote:
> You might want to limit memory a bit (sp_configure, max server memory), definitely remove higher
> priority. For the batch, consider using MAXDOP options for your queries, leaving some CPUs to other
> SQL users. But also look at how the batch is written, as part of the problem might have been
> blocking. For instance, perhaps you can split up some modifications over several transactions
> instead of one big transaction.
Thanks Tibor, A great help. On the windows priority, is that a well
known no-no? What problems does it cause?
Yes the script was a pile of erm ...
I wanted to add a new column, but not as the last column, so I scripted it
out of EM. So millions of records into a temp table, rename etc,
reindex etc etc. AND I put a transaction round the whole thing... Not
clever. Plus transaction backups were happening at the same time...
So I'm rewriting to simply add a NULLABLE column to the end of the
table using ALTER TABLE and removing the wrapping transaction. Should
be a lot faster...
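That simpler rewrite is essentially one metadata-only statement (table and column names hypothetical):

```sql
-- Adding a NULLable column at the end does not rewrite the existing rows
ALTER TABLE dbo.BigTable ADD NewColumn int NULL;
```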
And I thought I was a reasonable DBA... :o(
When doing my testing I was on the dev server using Simple Recovery
model so didn't hit the same problems.
Many thanks,
James|||On Mar 29, 12:56 pm, "JimLad" <jamesdbi...@.yahoo.co.uk> wrote:
Is it bad practice to temporarily switch to SIMPLE recovery mode when
doing these kinds of operations? Everyone will be off the database at
the time... It's taking forever!|||> Thanks Tibor, A great help. On the windows priority, is that a well
> known no-no? What problems does it cause?
I believe it is still considered a no-no. I recall at some version that a combination of settings
would lift SQL Server to real-time priority. I don't think it can happen anymore, but in some
extreme cases, you might starve the OS. Also, if you don't have anything else on the machine, then what
would SQL Server be higher than? I suggest you Google to see if you can find more official words on
this.
> I wanted to add a new column but not as the last column so I scripted
> out of EM.
Ugh. Yes, I can see why that happened. We all learn from our mistakes. :-)
--
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
"JimLad" <jamesdbirch@.yahoo.co.uk> wrote in message
news:1175169365.880312.134100@.y80g2000hsf.googlegroups.com...
> On Mar 29, 11:20 am, "Tibor Karaszi"
> <tibor_please.no.email_kara...@.hotmail.nomail.com> wrote:
>> You might want to limit memory a bit (sp_configure, max server memory), definitely remove higher
>> priority. For the batch, consider using MAXDOP options for your queries, leaving some CPUs to
>> other
>> SQL users. But also look at how the batch is written, as part of the problem might have been
>> blocking. For instance, perhaps you can split up some modifications over several transactions
>> instead of one big transaction.
>> --
>> Tibor Karaszi, SQL Server
>> MVPhttp://www.karaszi.com/sqlserver/default.asphttp://www.solidqualitylearning.com/
>> "JimLad" <jamesdbi...@.yahoo.co.uk> wrote in message
>> news:1175162797.522975.64110@.p77g2000hsh.googlegroups.com...
>>
>> > Hi,
>> > Our db server appeared to hang during a long running script last
>> > night. Our windows guys rebooted the server. However my suspicion is
>> > that the script was consuming all system resources but was still
>> > running.
>> > The SQL server is set to use all CPUs (4) and max all memory. It also
>> > has raised priority on Windows. Does this sound like a possibility? If
>> > so, what are the best settings for a db server, just so that we can
>> > keep access during intensive queries?
>> > When the server came back up, sql spent a great deal of time
>> > recovering this database, presumably because it was mid-transaction.
>> > Cheers,
>> > James
>> > MCDBA
> Thanks Tibor, A great help. On the windows priority, is that a well
> known no-no? What problems does it cause?
> Yes the script was a pile of erm ...
> I wanted to add a new column but not as the last column so I scripted
>> out of EM. So millions of records into a temp table, rename etc,
> reindex etc etc. AND I put a transaction round the whole thing... Not
> clever. Plus transaction backups were happening at the same time...
> So I'm rewriting to simply add a NULLABLE column to the end of the
> table using ALTER TABLE and removing the wrapping transaction. Should
> be a lot faster...
> And I thought I was a reasonable DBA... :o(
> When doing my testing I was on the dev server using Simple Recovery
> model so didn't hit the same problems.
> Many thanks,
> James
> Is it bad practice to temporarily switch to SIMPLE recovery mode when
> doing these kinds of operations?
That depends on what backups you do for the database. Switching to simple will break the log backup
sequence, so talk to the backup administrator first!
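To make the warning concrete, this is roughly what the sequence looks like in practice (a sketch only; the backup paths are made up):

```sql
ALTER DATABASE audit_trail SET RECOVERY SIMPLE
GO
-- ... do the bulk work here; the log no longer accumulates for log backups ...
ALTER DATABASE audit_trail SET RECOVERY FULL
GO
-- The old log chain is now broken: take a full (or differential) backup
-- before log backups are useful again.
BACKUP DATABASE audit_trail TO DISK = 'E:\backup\audit_trail_full.bak'
GO
BACKUP LOG audit_trail TO DISK = 'E:\backup\audit_trail.trn'
GO
```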
--
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
"JimLad" <jamesdbirch@.yahoo.co.uk> wrote in message
news:1175178098.755362.269210@.b75g2000hsg.googlegroups.com...
> On Mar 29, 12:56 pm, "JimLad" <jamesdbi...@.yahoo.co.uk> wrote:
> Is it bad practice to temporarily switch to SIMPLE recovery mode when
> doing these kinds of operations? Everyone will be off the database at
> the time... It's taking forever!
>|||On Mar 29, 4:39 pm, "Tibor Karaszi"
<tibor_please.no.email_kara...@.hotmail.nomail.com> wrote:
> > Is it bad practice to temporarily switch to SIMPLE recovery mode when
> > doing these kinds of operations?
> That depends on what backups you do for the database. Switching to simple will break the log backup
> sequence, so talk to the backup administrator first!
> --
> Tibor Karaszi, SQL Server MVPhttp://www.karaszi.com/sqlserver/default.asphttp://www.solidqualitylearning.com/
So, in the end I did the following:
This solution relies on single-user access to the database, i.e. kicking
everybody off. It also relies on changing the recovery model and is not
transaction safe. It is for large multi-million-row updates; smaller
updates don't need this stuff.
Stop log backups
Take FULL BACKUP
Script:
ALTER DATABASE audit_trail SET RECOVERY SIMPLE
-- transaction log gets truncated after transactions have completed; stops the log growing
GO
-- Add the extra column as nullable
ALTER TABLE dbo.change_keys
ADD ordinal_position tinyint NULL -- NULLABLE means very little cost
GO
-- Run the big update in batches. Test for an appropriate batch size; at some
-- point performance will go bad.
-- i.e. in this case batches of 300000 took ~1s, batches of 500000 took > 1 min.
-- Update all change_keys records to set ordinal_position to 1. Do so
-- in optimised batches of 300,000 records.
SET NOCOUNT ON
DECLARE @maxid int
DECLARE @minid int
DECLARE @currentid int
DECLARE @step int
DECLARE @count int
SET @maxid = (SELECT MAX(change_id) FROM change_keys)
SET @minid = (SELECT MIN(change_id) FROM change_keys)
SET @currentid = @minid
SET @step = 300000
SET @count = 0
WHILE @currentid <= @maxid
BEGIN
    UPDATE change_keys WITH (TABLOCK)
    -- TABLOCK is useful as the only user on the database; stops page locks having to escalate
    SET ordinal_position = 1
    WHERE change_id BETWEEN @currentid AND @currentid + @step - 1
    SET @count = @count + @@ROWCOUNT
    PRINT @count
    SET @currentid = @currentid + @step
END
PRINT 'Total updated records: ' + CONVERT(varchar(15), @count)
GO
-- Now make ordinal_position NOT NULL, if necessary
ALTER TABLE dbo.change_keys
ALTER COLUMN ordinal_position tinyint NOT NULL
GO
-- Put the db back into FULL recovery mode
ALTER DATABASE audit_trail SET RECOVERY FULL
GO
I have simplified from what I needed to do, but hopefully this is
helpful to someone...
Thanks again, Tibor.
Cheers,
James
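For anyone comparing the two approaches from this thread: the EM-generated script rebuilds the whole table, while appending a NULLable column is close to free. A simplified sketch of the contrast (table and column names taken from the script above; the EM script is heavily abbreviated here):

```sql
-- What EM generates for a column inserted in the middle (simplified):
-- copy every row to a new table, drop the original, rename -- all fully logged.
SELECT * INTO dbo.Tmp_change_keys FROM dbo.change_keys
GO
DROP TABLE dbo.change_keys
GO
EXEC sp_rename 'dbo.Tmp_change_keys', 'change_keys'
GO

-- Versus the cheap route: adding a NULLable column (no default) at the end
-- is a metadata-only change, no matter how many rows the table has.
ALTER TABLE dbo.change_keys ADD ordinal_position tinyint NULL
GO
```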