
We are trying to save the result of a query into a database table by tokenizing a column, but the query takes more than 3 minutes to finish. My question is: is there a faster way to tokenize a column?

Query:

Select *
from
(
        SELECT TblTwo.ID Desc_ID,
           (CASE WHEN TblTwo.Description_Type = 'Default' THEN coalesce(tblOne.Code + ' ','') ELSE '' END) + TblTwo.[Description] AS String   
        FROM TableOne tblOne
        INNER JOIN TableTwo TblTwo ON tblOne.CodeID = TblTwo.CodeID
        WHERE TblTwo.Description_Type IN ('Default','Custom')
) ResultOne   ---- Returns almost 20,000 rows and takes 1 sec to execute
CROSS APPLY dbo.udf_TokenString(String) ---- Returns almost 194,379 rows and takes 3 mins to execute
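For reference, the function can be timed in isolation on one of the sample strings below; this should confirm that almost all of the 3 minutes is spent inside udf_TokenString rather than in the join:

SELECT * FROM dbo.udf_TokenString('HCD.39777 Personal test history of documents')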

Result of the derived table ResultOne:

Desc_ID  String
1        HCD.39877
2        HCD.39777 Personal test history of documents
4        CD.397 Unspecified document text
5        N.7 This is end of file
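
Tracing the function's loop by hand (script below), each row expands into one token per word plus its character position, so for Desc_ID 2 the CROSS APPLY would presumably return rows like:

Item       location
HCD.39777  1
Personal   11
test       20
history    25
of         33
documents  36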

Script of the function:

CREATE FUNCTION [dbo].udf_TokenString (@string VARCHAR(MAX))
RETURNS
@Results TABLE
(
    Item VARCHAR(255),
    location INT,
    Sequence INT IDENTITY PRIMARY KEY
)
AS
BEGIN
    DECLARE @Len INT, @Start INT, @Cursor INT, @Length INT

    SELECT @Cursor = 1, @Len = LEN(@string)

    WHILE @Cursor < @Len
    BEGIN
        -- Find the start of the next token: a token character preceded
        -- by a non-token character (the leading space covers position 1)
        SELECT @Start = PATINDEX('%[^.A-Za-z0-9][A-Za-z0-9%]%',
                        ' ' + SUBSTRING(@string, @Cursor, 50)
                        ) - 1

        -- No further token in this window: stop
        IF @Start < 0 BREAK

        -- Measure the token length: scan up to the first non-token character
        SELECT @Length = PATINDEX('%[^.A-Z''a-z0-9-%]%', SUBSTRING(@string, @Cursor + @Start + 1, 50) + ' ')

        -- Store the token and its character position in the input string
        INSERT INTO @Results (Item, location)
            SELECT SUBSTRING(@string, @Cursor + @Start, @Length), @Cursor + @Start

        -- Advance the cursor past the token just captured
        SELECT @Cursor = @Cursor + @Start + @Length + 1
    END

    RETURN
END


GO
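
For comparison, here is a minimal set-based sketch of the same tokenization, assuming SQL Server 2016 or later and that splitting on single spaces is an acceptable approximation. STRING_SPLIT returns neither the token's character position nor a guaranteed ordering, so it is not a drop-in replacement for udf_TokenString:

-- Hypothetical rewrite: one set-based pass instead of a row-by-row WHILE loop
SELECT ResultOne.Desc_ID, s.value AS Item
FROM
(
    SELECT TblTwo.ID Desc_ID,
           (CASE WHEN TblTwo.Description_Type = 'Default' THEN COALESCE(tblOne.Code + ' ', '') ELSE '' END) + TblTwo.[Description] AS String
    FROM TableOne tblOne
    INNER JOIN TableTwo TblTwo ON tblOne.CodeID = TblTwo.CodeID
    WHERE TblTwo.Description_Type IN ('Default', 'Custom')
) ResultOne
CROSS APPLY STRING_SPLIT(ResultOne.String, ' ') s
WHERE s.value <> ''   -- drop empty tokens produced by consecutive spaces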