[xsl] Memory problem when tokenizing big data

Subject: [xsl] Memory problem when tokenizing big data
From: "Richard Zhang" <richard_zhang@xxxxxxxxxx>
Date: Tue, 10 Jan 2006 09:30:00 -0500
Thanks for your reply to my prior question about breaking down strings.

Now I am trying to use str:tokenize to break down a large piece of data.

The input data looks like this:

         <textdata sep=" &#x000A;&#x000D;">
           5.1 4.9 4.7 4.6 5 5.4 4.6 5 4.4 4.9
           ...
           ...
         </textdata>
           ...
           ...
         <textdata sep=" &#x000A;&#x000D;">
           5.1 4.9 4.7 4.6 5 5.4 4.6 5 4.4 4.9
           ...
           ...
         </textdata>

and my XSL template looks like this:

 <xsl:template match="textdata">
   <data>
   <xsl:for-each select="str:tokenize(.,' &#x000A;&#x000D;')">
     <e>
     <xsl:value-of select="."/>
     </e>
   </xsl:for-each>
   </data>
 </xsl:template>
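
For completeness, str:tokenize is an EXSLT extension function, so the stylesheet must declare the EXSLT strings namespace or the processor will not recognize the call. A minimal complete stylesheet wrapping the template above might look like this (the exclude-result-prefixes attribute is optional and just keeps the str prefix out of the output):

 <?xml version="1.0"?>
 <xsl:stylesheet version="1.0"
     xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
     xmlns:str="http://exslt.org/strings"
     exclude-result-prefixes="str">

   <xsl:template match="textdata">
     <data>
       <xsl:for-each select="str:tokenize(., ' &#x000A;&#x000D;')">
         <e>
           <xsl:value-of select="."/>
         </e>
       </xsl:for-each>
     </data>
   </xsl:template>

 </xsl:stylesheet>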

The textdata elements can be very big. My question is: will tokenizing run into problems when handling big data? If so, how much data can str:tokenize handle? When I ran the transformation in JBuilder, it reported something like a '10mb heap left' problem.

Thanks a lot.
Richard

