Thank you so much for the response...
Just to answer your questions: although I can play with the databases, I can't mess with the apps. That unfortunately is not an option...
But to expand on the trigger, that was the plan...and that query will work with one exception, I do not know how to...
Sorry, I meant the end result should be this:
LIST_ID   list_type   file_number   owner_id
10001     general     abc123        200
10004     follow      abc345        300
10003     general     abc345        300
10003     general     abc567        300
...
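Purely as a sketch of the trigger idea mentioned above, something along these lines might work. Note that db2, dbo.lists, and dbo.lists_copy are made-up names (the real table names never made it into the thread); only the four columns come from the result set above:

-- Sketch of a cross-database sync trigger. The database and table
-- names are assumptions; only the four columns are from the thread.
CREATE TRIGGER trg_lists_sync
ON dbo.lists
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Push changed rows across to the copy in the other database.
    UPDATE c
    SET    c.list_type   = i.list_type,
           c.file_number = i.file_number,
           c.owner_id    = i.owner_id
    FROM   db2.dbo.lists_copy AS c
    JOIN   inserted           AS i ON i.LIST_ID = c.LIST_ID;

    -- Add rows that do not exist in the copy yet.
    INSERT INTO db2.dbo.lists_copy (LIST_ID, list_type, file_number, owner_id)
    SELECT i.LIST_ID, i.list_type, i.file_number, i.owner_id
    FROM   inserted AS i
    WHERE  NOT EXISTS (SELECT 1 FROM db2.dbo.lists_copy AS c
                       WHERE c.LIST_ID = i.LIST_ID);
END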
OK...so I have been beating myself up and I have gone cross-eyed enough to put my head down and ask for help...it has defeated me and I need some guidance.
Here goes:
I have two databases that I would like to have update each other when variables change from one to the other. Right now I am going...
Thanks for the reply...the query is a bit more complex than the original one you posted...sorry, I will try to explain a bit more in depth...
I have customers that have a series of payments set up in the system...let's say that a person is making $100.00 payments every week on Friday...if...
OK...I have a query I was hoping maybe someone can graciously assist me with...the scenario is this...
I have a table of customers that make regular payments. For example's sake, let's say there are 6 columns like this in the payments table:
payment_id customer_id paymentdate...
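Since the column list above is cut off, here is a guess at what such a table might look like, just to make the scenario concrete. Everything past the first three columns is an assumption:

-- Hypothetical sketch of the payments table described above.
-- payment_id, customer_id, and paymentdate come from the post;
-- the remaining columns are assumptions for illustration only.
CREATE TABLE payments (
    payment_id   INT IDENTITY(1,1) PRIMARY KEY,
    customer_id  INT          NOT NULL,
    paymentdate  DATETIME     NOT NULL,   -- date the payment was made
    amount       DECIMAL(9,2) NOT NULL,   -- e.g. the 100.00 weekly payment
    due_date     DATETIME     NULL,       -- assumed: scheduled date (Friday)
    status       VARCHAR(20)  NULL        -- assumed: paid / missed / pending
);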
Hi,
I have an office that has two ISPs, a T1 and DSL. I have been confused about how to set up the network for the managers and for the sales agents.
Here is what I have: the T1 handles all the VoIP stuff and some minimal internet usage for the sales reps. The reps have one Ethernet line going...
Well Travis, that did not shave any time off...still running at 3.5+ minutes...
Thanks...
Bueller, Bueller, Bueller...I'm just panhandling here, I guess...
Thanks for the suggestions in advance...
Here it is without the null:
--Top(1000)
     |--Stream Aggregate(GROUP BY:([D].[ACCT_NO]) DEFINE:([D].[ZIPCODE]=ANY([D].[ZIPCODE])))
          |--Filter(WHERE:([R].[ACCT_NO]=NULL))
               |--Merge Join(Left Outer Join, MERGE:([D].[ACCT_NO])=([R].[ACCT_NO])...
Thanks for the reply...
I am trying to filter out all the records in the avoid_areas table by zipcode...
I was using a 'where zipcode not in (select zipcode from avoid_areas)' before, but I was trying to speed it up by forcing the index and using a left outer join.
It will work and return the correct...
Can someone please look over this code and tell me if there is anything they can recommend to speed it up...I have all the qualified fields indexed or clustered, and the thing is, it runs fast (<1 sec) if I run it without the last line, "AND A.zipcode is null"; otherwise it runs for 3.5...
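For comparison, the three usual ways to write that anti-join. Here deals is a made-up table name; avoid_areas, acct_no, and zipcode are from the posts:

-- 1) NOT IN (the original form; beware NULLs in avoid_areas.zipcode,
--    which make NOT IN return no rows at all)
SELECT d.acct_no, d.zipcode
FROM deals AS d
WHERE d.zipcode NOT IN (SELECT a.zipcode FROM avoid_areas AS a);

-- 2) LEFT OUTER JOIN ... IS NULL (the rewritten form described above)
SELECT d.acct_no, d.zipcode
FROM deals AS d
LEFT OUTER JOIN avoid_areas AS a ON a.zipcode = d.zipcode
WHERE a.zipcode IS NULL;

-- 3) NOT EXISTS (often the best of the three on SQL Server:
--    NULL-safe and usually compiled to the same anti-semi-join)
SELECT d.acct_no, d.zipcode
FROM deals AS d
WHERE NOT EXISTS (SELECT 1 FROM avoid_areas AS a
                  WHERE a.zipcode = d.zipcode);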
Could anyone please lead me in the right direction to make my query a bit more efficient using joins as opposed to the NOT EXISTS subquery? The query is below...
INSERT INTO prod_22 (acct_no, zipcode)
SELECT TOP 1000000 acct_no, zipcode
FROM [20054_dup] WITH (TABLOCKX, HOLDLOCK)
WHERE timezone...
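For what it's worth, a sketch of that insert written both ways. The WHERE clause above is cut off, so the timezone filter and the NOT EXISTS condition here are assumptions:

-- NOT EXISTS form (assumed predicate, since the original is truncated)
INSERT INTO prod_22 (acct_no, zipcode)
SELECT TOP 1000000 d.acct_no, d.zipcode
FROM [20054_dup] AS d WITH (TABLOCKX, HOLDLOCK)
WHERE d.timezone = 'EST'                      -- assumed filter
  AND NOT EXISTS (SELECT 1 FROM prod_22 AS p
                  WHERE p.acct_no = d.acct_no);

-- Equivalent LEFT JOIN form
INSERT INTO prod_22 (acct_no, zipcode)
SELECT TOP 1000000 d.acct_no, d.zipcode
FROM [20054_dup] AS d WITH (TABLOCKX, HOLDLOCK)
LEFT JOIN prod_22 AS p ON p.acct_no = d.acct_no
WHERE d.timezone = 'EST'                      -- assumed filter
  AND p.acct_no IS NULL;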
In that case, is it necessary to even have a clustered index on the table...is it doing more harm than good?
My batches are probably around 10,000 each, give or take...they are different every time...
Thanks for the replies...
The History table that the data is imported into may already contain the account numbers. The table just houses the account numbers, when they were accessed, and with which daily batch. They are accessed about every 90 days or more...the data is inserted in the...
Thanks for the suggestion...This is what the query looks like for one of the inserts. There is a batch wrapper for this that goes off the zipcode in 1000 increments to break it up...any suggestions to help optimize this? I greatly appreciate it.
INSERT INTO prod_22 (acct_no, zipcode)
SELECT...
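A rough sketch of what that zipcode batch wrapper might look like around the insert. Only the 1000-increment idea comes from the post; the loop bounds and treating zipcode as numeric are assumptions:

-- Batch wrapper sketch: walks the zipcode range in 1000-zip slices.
-- @lo/@hi bounds are assumed; zipcode is treated as an int here
-- purely for the loop arithmetic.
DECLARE @lo INT, @hi INT;
SET @lo = 0;
SET @hi = 99999;

WHILE @lo <= @hi
BEGIN
    INSERT INTO prod_22 (acct_no, zipcode)
    SELECT d.acct_no, d.zipcode
    FROM [20054_dup] AS d WITH (TABLOCKX, HOLDLOCK)
    WHERE d.zipcode >= @lo AND d.zipcode < @lo + 1000;  -- one slice

    SET @lo = @lo + 1000;
END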
I have some large tables that I use as log/history file entry tables. The tables contain anywhere from 10 million to 50 million records that do not have a unique key, as it is not necessary, but have a clustered index on the account number. This is not a unique key, but could be if I add the...
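For reference, a minimal sketch of the index being described; history_table is a stand-in name, while acct_no appears in the later posts:

-- Non-unique clustered index on the account number. On duplicate key
-- values SQL Server adds a hidden 4-byte uniqueifier, which is part of
-- the harm-versus-good trade-off asked about above.
CREATE CLUSTERED INDEX IX_history_acct_no
ON dbo.history_table (acct_no);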