
Multiple IDs to SP- Please Help finding another way

Status
Not open for further replies.

sonya9879 (Programmer) - Jun 18, 2004
Hi,

I have the following stored procedure:

CREATE PROCEDURE testingprocedure
(
@IDs VarChar(8000),
@DV int
)
AS
Declare @IdNumbers Table(Id Integer)

Insert
Into @IdNumbers(Id)
Select FieldName As Id from dbo.Split(@IDs, ',')

SELECT
ID,
DATANAME,
DV
FROM
table_name
Inner Join @IdNumbers IdNum on Table_Name.Id = IdNum.Id
WHERE
DV = @DV

And then the following function:

CREATE FUNCTION Split(@CommaDelimitedFieldNames VarChar(8000), @Character VarChar(20))
RETURNS @Tbl_FieldNames TABLE (Id Integer IDENTITY(1, 1), FieldName VarChar(100))
AS
BEGIN
    SET @CommaDelimitedFieldNames = @CommaDelimitedFieldNames + @Character
    DECLARE @Pos1 Int
    DECLARE @Pos2 Int
    SET @Pos1 = 1
    SET @Pos2 = 1
    WHILE @Pos1 < Len(@CommaDelimitedFieldNames)
    BEGIN
        SET @Pos1 = CharIndex(@Character, @CommaDelimitedFieldNames, @Pos1)
        INSERT @Tbl_FieldNames
        SELECT Cast(Substring(@CommaDelimitedFieldNames, @Pos2, @Pos1 - @Pos2) AS VarChar(100))
        SET @Pos2 = @Pos1 + 1
        SET @Pos1 = @Pos1 + 1
    END
    RETURN
END
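For reference, the table-valued function above can be exercised on its own. A quick sanity check (run against a database where dbo.Split is installed; the literal list is just sample input) might look like:

```sql
-- Returns one row per element; Id preserves the original order
SELECT Id, FieldName
FROM dbo.Split('10,20,30', ',')
-- Id  FieldName
-- 1   10
-- 2   20
-- 3   30
```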

the way I call the procedure is

testingprocedure '1,2,3,4,5,6,7', 1

This works fine, but my problem is that I have several thousand IDs I need to pass to the procedure, and this method just will not work. Does anyone know a way around it? Thanks
 
Pass the IDs as XML and then you can query the XML directly.
Have a look at OPENXML, and if you need more info post again.


"I'm living so far beyond my income that we may almost be said to be living apart."
 
thanks hmckillop, so should I have one query that gets all the IDs, put that into XML, and then use OPENXML? I am going to take a look at OPENXML to see what it does.
 
Sorry, what I meant was that you pass the data to the procedure as XML, and then an XML query will let you directly access the data within the XML passed through.
You won't need to perform an expensive string-splitting operation.
The one thing to remember with XML, though, is to ensure that you close the document as soon as you finish using it, as it can be memory intensive.
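As a minimal sketch of that prepare/release pattern (the `root`/`data`/`myid` element and attribute names here are placeholders, not anything from Sonya's schema):

```sql
DECLARE @hDoc INT
DECLARE @xml VARCHAR(8000)
SET @xml = '<root><data myid="1" /><data myid="2" /></root>'

-- Parse the XML text and get a document handle
EXEC sp_xml_preparedocument @hDoc OUTPUT, @xml

-- Query the parsed document as a rowset
SELECT myid
FROM OPENXML(@hDoc, 'root/data', 1) WITH (myid INT '@myid')
-- myid: 1, 2

-- Release the handle as soon as you are done; the parsed document
-- stays in SQL Server's memory until you do
EXEC sp_xml_removedocument @hDoc
```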


 
thanks so much hmckillop for your help, i am reading about OPENXML and will start experimenting with it as soon as I have finished reading the essentials.
 
Sonya,

I sincerely apologize. There was a thread going about a week ago, and you asked this question, and I responded, blah, blah blah....

Here's a way to resolve your problem

1) Create this procedure
Code:
CREATE Procedure dbo.ParseArray(@myText Text, @delimiter varchar(10))  
AS  
-- The array coming in comma delimited.  
-- Each element in the array has a limited size of 8000 bytes  
Set nocount on  
Declare @TextLen int  
Declare @TextSubLen int  
Declare @Element varchar(8000)  
Declare @notDone bit  
Declare @ptrval binary(16)  
Declare @ans int  
Declare @skip int  
Create Table #Result(id_num int IDENTITY(1,1), Element varchar(8000))  
Create Table #TempText(myText Text)  
Insert 	Into #TempText(myText)  
	Values(@myText)  
SELECT @ptrval = TEXTPTR(myText)  
FROM #TempText  
Set @notDone = 1  
Set @delimiter = ltrim(rtrim(@delimiter))  
Set @skip = len(@delimiter)  
While @notDone <> 0  
Begin  
	Select @notDone = Patindex('%' + @delimiter + '%', myText) From #TempText		--Check if last one element left	  
	If(@notDone = 0)							--if so, extract and exit  
	Begin  
		Insert Into #Result(Element)  
		Select myText From #TempText  
		set @notdone = 0  
	End  
	Else  
	Begin  
		Select @TextLen = DataLength(myText) From #TempText			--get length of Text		  
		Select @TextSubLen = (@skip - 1) + Patindex('%' + @delimiter + '%', myText) From #TempText		--get the length of the first element  
		Select @Element = Substring(myText, 1, @TextSubLen - @skip) From #TempText	--extract that element  
		  
		Insert Into #Result(Element) Values(@Element)  
	  
		Set @ans = @TextSubLen  
		UPDATETEXT #TempText.myText @ptrval 0 @ans				--Remove the element and its delimiter  
	End  
End  
Select * from #Result
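A quick way to try the procedure above on its own (the literal list is just sample input):

```sql
-- Each element comes back as one row, in order:
EXEC dbo.ParseArray 'red,green,blue', ','
-- id_num  Element
-- 1       red
-- 2       green
-- 3       blue
```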

Then, modify your procedure...

Code:
CREATE PROCEDURE testingprocedure
(
    @IDs Text,
    @DV int
)
AS
SET NOCOUNT ON
Create Table #IdNumbers(RowId Integer, Id Integer)

Insert Into #IdNumbers
Exec ParseArray '1,2,3,4,5,6', ','

SELECT       
    ID, 
    DATANAME, 
    DV 
FROM         
    table_name
    Inner Join #IdNumbers IdNum on Table_Name.Id = IdNum.Id
WHERE
    DV = @DV

Again, my sincerest apologies for not getting back to you in a timely fashion.

-George

Strong and bitter words indicate a weak cause. - Fortune cookie wisdom
 
Ooops

This...

Insert Into #IdNumbers
Exec ParseArray '1,2,3,4,5,6', ','

should have been

Insert Into #IdNumbers
Exec ParseArray @Ids, ','


-George

 
hey i thought u abandoned me in this difficult world :)
I'm looking at your procedures. Now I am debating which way to go: I have two potential solutions, and which one is better? The easy way for me would be going with your procedure, because it means fewer changes on my end, but in your honest opinion, do u think this is a feasible way to go vs the OPENXML solution hmckillop mentioned?
 
Sonya,

I can't honestly say. I've never used the OpenXML method. I know that hmckillop is a smart guy that has given a lot of good advice to a lot of people. I respect him.

Unfortunately, I suggest that you experiment both ways. My method does a lot of string manipulation, which is usually fairly slow. However, with my method, you have a ready-to-run sp that you can easily test with a large dataset. Give it a go. See what happens. If the performance is not acceptable, look for an alternative solution. As a simple test, write an SP that accepts a long string, then parse it out and select the data back. Use 'reasonable' data as input. If you expect to pass 10,000 integers, then test with 10,000 integers. If a million, then a million.

Good luck.

-George

 
As usual you are a STAR :), thanks George for your input, and also to hmckillop. I will test your procedure and also test it against using XML (OPENXML). I will let you know the outcome of both when I am done.
Again, thanks to both of you!
 
Thanks for the comment gmmastros .

I agree, you need to see what works best; the downside of using OPENXML is that you will need to format the data as XML prior to sending it.

Here is a sample of how it would work (I only had time to put this together this morning):

Code:
CREATE PROCEDURE testingprocedure
(
    @XMLIDs VarChar(8000),
    @DV int
)
AS

DECLARE  @v_XMLIDs TABLE(MyID BIGINT PRIMARY KEY)

--Parse the XML and get a document handle
DECLARE @FileHandle INT -- Handle to the parsed XML document

--Execute the built-in XML document parser
EXEC dbo.sp_xml_preparedocument @FileHandle output, @XMLIDs 
INSERT INTO @v_XMLIDs (MyID)
SELECT 
	*
FROM
	OPENXML(@FileHandle, 'root/data', 1 ) 
	WITH (	myid       Int         	'@myid')

EXEC dbo.sp_xml_removedocument @FileHandle

SELECT       
    ID, 
    DATANAME, 
    DV 
FROM         
    table_name
    Inner Join @v_XMLIDs IdNum on Table_Name.Id = IdNum.MyId
WHERE
    DV = @DV

/*
--to test proc run this
exec testingprocedure '<root>
	<data myid="10001" />
	<data myid="10002" />
	<data myid="10003" />
</root>', 1 */

Also agree in the testing proposal - ensure you use live size data where possible.


 
thanks hmckillop this is a lot of useful help and I really appreciate it. I will test it both things with live size data and let you know the results. Have a nice day!
 
I ran a small series of tests, just out of my own interest in seeing what is best.

I thought string operations were considerably heavier and just wanted to reconfirm this understanding.
So basically I scripted out a medium-sized data table so that it contained 89329 rows.

I then had two versions of essentially the same proc, one using XML and one using the string-split procedure provided by gmmastros.
The XML version returned all rows back in 16 seconds.
The string-split version is still running at over 11 minutes.

Now I would guess there is a threshold where the string operations would execute faster than XML for smaller sets of data, but I am not sure how to test this without a lot of work.
But in summary: if you plan to operate with a large dataset, and you can add the additional tag information, from a SQL point of view the XML is significantly faster.
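One low-effort way to probe that threshold would be to time both variants at increasing input sizes. A sketch, assuming the two procs have been saved under the hypothetical names testingprocedure_split and testingprocedure_xml, and that @ids/@xml are rebuilt at each size (100, 1000, 10000, ... elements):

```sql
-- Hypothetical timing harness; @ids and @xml hold the two input formats
DECLARE @t0 DATETIME, @t1 DATETIME

SET @t0 = GETDATE()
EXEC testingprocedure_split @ids, 1      -- string-split variant
SET @t1 = GETDATE()
PRINT 'split: ' + CAST(DATEDIFF(ms, @t0, @t1) AS VARCHAR) + ' ms'

SET @t0 = GETDATE()
EXEC testingprocedure_xml @xml, 1        -- OPENXML variant
SET @t1 = GETDATE()
PRINT 'xml:   ' + CAST(DATEDIFF(ms, @t0, @t1) AS VARCHAR) + ' ms'
```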

If you want a copy of the tests I can post exact procedures and scripts.



 
String job still running at 20 minutes.

 
String job still running at 30 minutes, so I won't let it go on any longer. Anyway, you can see where it is leading.


 
hi hmckillop, thanks so much for the work u put into running those tests. I am really sorry that I couldn't get back to you sooner. I got into a family crisis and have been out of the game for several weeks. Now I am back to work and need to catch up really fast. I see that you ran both tests; it would be great to have a copy of them, as it will definitely save me some time. Really appreciate it, and again, please accept my apologies for not getting back to you sooner.
 
No problem, only too glad to help

Just to let you know, the test finished at around the 42 minute mark.



 
I think so, but the files are quite big.
I can upload them when I get into work tomorrow.


 
I am just realising my test won't work for you, as I replaced one of your tables with a table I have (this was used for the join).
Therefore the IDs wouldn't match.


Essentially what I did was select a large table in an existing database, then extract a key field of data for each test proc call,
i.e.
select convert(varchar, [KeyFieldName]) + ',' from [MyTableName] --this for the non-XML test

select '<data myid="' + convert(varchar, [KeyFieldName]) + '" />' from [MyTableName] --this for the XML test

Then add the necessary slight amendments, i.e. add <root> and </root> before and after the XML string, remove the last comma from the text string, add quotes, etc.

Then I executed the procs separately.
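Putting those steps together, a sketch of how the two input strings could be assembled in one pass ([MyTableName] and [KeyFieldName] are placeholders, and the quirk of accumulating into a variable in a SELECT is SQL Server-specific; note VARCHAR(8000) caps out quickly, so a genuinely large list would need the Text-based approach from the earlier posts):

```sql
DECLARE @ids VARCHAR(8000), @xml VARCHAR(8000)
SET @ids = ''
SET @xml = ''

-- Accumulate one delimited element per row (placeholder names)
SELECT @ids = @ids + CONVERT(VARCHAR, [KeyFieldName]) + ','
FROM [MyTableName]

SELECT @xml = @xml + '<data myid="' + CONVERT(VARCHAR, [KeyFieldName]) + '" />'
FROM [MyTableName]

-- Trim the trailing comma and wrap the XML in a root element
SET @ids = LEFT(@ids, LEN(@ids) - 1)
SET @xml = '<root>' + @xml + '</root>'
```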



 