java - Regarding stitching of files -


I have a set of large files that are returned in response to a query, and I want the collected result to be shown to the user. Earlier, when the files were small, I would read them individually and append the results to a third file, but reading them still means a few seconds of delay which users cannot afford. Is there a way in which the files can be stitched together and the overall result presented to the user?

Earlier I tried a command like cat file1 file2 > file3, which correctly concatenates file1 and file2 into file3 when run from the shell, but the command does not work when I run it from Java:

Process p = Runtime.getRuntime().exec("cat file1 file2 > file3");
BufferedReader br = new BufferedReader(new InputStreamReader(p.getInputStream()));

Judging by the command you are using from the shell, your "aggregation" is really just concatenating the files.
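(As an aside, the exec() call in the question fails because Runtime.exec() does not start a shell, so the > redirection is never interpreted; cat receives ">" and "file3" as literal arguments. If an external command really were needed, the redirection would have to be performed by an explicitly invoked shell. A minimal sketch, assuming the placeholder file names from the question:

import java.io.IOException;

public class ShellConcat {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Run the concatenation through a shell so that the ">" redirection is interpreted.
        // "file1", "file2" and "file3" are the placeholder names from the question.
        String[] cmd = { "/bin/sh", "-c", "cat file1 file2 > file3" };
        Process p = Runtime.getRuntime().exec(cmd);
        int exitCode = p.waitFor(); // wait for cat to finish writing file3
        System.out.println("cat exited with code " + exitCode);
    }
}

But, as explained below, the external command is not needed at all.)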

If this is the case, then just do something like this:

OutputStream os = ...;
byte[] buffer = new byte[8192]; // or maybe a bit bigger
int nosRead;
try (InputStream is = new FileInputStream("file1")) {
    while ((nosRead = is.read(buffer)) > 0) {
        os.write(buffer, 0, nosRead);
    }
}
try (InputStream is = new FileInputStream("file2")) {
    while ((nosRead = is.read(buffer)) > 0) {
        os.write(buffer, 0, nosRead);
    }
}

There is no need to use an external command to concatenate the files. "Stitching" the streams directly (as above) also means that you can start sending data to the user immediately. Assuming a plain byte-for-byte copy is all your "stitching" requires, Readers/Writers should not be necessary, and avoiding them makes it a bit faster.
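For illustration, here is a self-contained sketch of that direct-copy approach, generalized to any number of input files. The class name, the stitch helper, and the file names are placeholders rather than part of the original answer; the OutputStream could just as well be a socket or servlet response stream:

import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StitchFiles {
    // Copies every input file, in order, onto the given output stream.
    static void stitch(OutputStream os, String... inputs) throws IOException {
        byte[] buffer = new byte[8192];
        for (String name : inputs) {
            try (InputStream is = new FileInputStream(name)) {
                int nosRead;
                while ((nosRead = is.read(buffer)) > 0) {
                    os.write(buffer, 0, nosRead);
                }
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // "file1", "file2" and "file3" are placeholder names from the question.
        try (OutputStream os = new FileOutputStream("file3")) {
            stitch(os, "file1", "file2");
        }
    }
}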
