If you’ve ever tried to program a large detail report in PL/SQL, you know what I’m talking about. The standard file handling arrangements of the UTL_FILE package are extremely rigid and inelegant, so you think you’ll be able to circumvent them by using DBMS_OUTPUT.PUT_LINE calls and spooling the output to the report file.
But alas, you find you can’t output more than 1M bytes using this approach. The system upchucks an ORU-10027 “buffer overflow” error (raised as ORA-20000) saying that you’ve overflowed the buffer capacity.
So my question is, has anyone found a way to circumvent this problem, without abandoning PL/SQL altogether? Is there any way to manage the buffer and get all the output that’s coming to you?
isn’t it just simpler to INSERT all generated data into a “straight” SQL table as each record is generated, then do a “SELECT * FROM table_name” after the PL portion has finished processing to bring it all up in a formatted report? also allows for additional sorting, grouping, or other miscellaneous manipulation afterward.
either that, or get the DBA to increase the buffer size.
don’t hit me, i am trying to be helpful.
wolfman, I did try that, but you’re only allowed to go up to 1M. Your code will compile if you use a higher number, but it will still fail on the 1000001st character.
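For reference, the cap being described looks something like this in SQL*Plus (sizes and the loop are illustrative; this is a sketch of the pre-10gR2 behaviour, not a recipe):

```sql
-- SQL*Plus will happily accept the setting...
SET SERVEROUTPUT ON SIZE 1000000

BEGIN
  DBMS_OUTPUT.ENABLE(buffer_size => 2000000);  -- compiles fine even past 1M...
  FOR i IN 1 .. 20000 LOOP
    -- ...but writing 2,000,000+ bytes still blows up around the
    -- 1,000,001st character with the buffer-overflow error
    DBMS_OUTPUT.PUT_LINE(RPAD('x', 100, 'x'));
  END LOOP;
END;
/
```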
lachesis, with your scenario I’d have to create the table before the PL/SQL part, right? Actually it wouldn’t be a problem, since I have to have a SQL script to call the PL/SQL proc anyway, and I can plan the creation of each report table separately since there’s a separate SQL script for each proc. But there’s still the problem that this requires an application to create database tables on the fly, and possibly large ones. This isn’t likely to delight my local DBA, not if my experience with DBAs is any guide.
well, either that or you just have one standard table for all possible PL/SQL reports. make all the data into one big concatenated field. you can SUBSTRING it apart when you do the report.
ugly, i know, but that way you could have a “one size fits all” table that always just sits out there, waiting for the next PL/SQL run to load it up. (just do remember to TRUNCATE the table at the beginning of the SQL routine that calls the PL/SQL code.)
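a “one size fits all” table along those lines might look something like this (table and column names are made up; SUBSTR does the un-concatenating at report time):

```sql
-- hypothetical universal report table; always sits out there waiting
CREATE TABLE report_lines (
  line_no   NUMBER,
  line_text VARCHAR2(4000)   -- all fields concatenated into one big string
);

-- at the top of the calling SQL script:
TRUNCATE TABLE report_lines;

-- inside the PL/SQL proc, in place of DBMS_OUTPUT.PUT_LINE:
INSERT INTO report_lines (line_no, line_text)
VALUES (v_line_no,
        RPAD(v_cust_name, 30) || RPAD(v_region, 10) ||
        TO_CHAR(v_amount, '0999999.99'));

-- at report time, SUBSTR it back apart:
SELECT SUBSTR(line_text, 1, 30)            AS cust_name,
       SUBSTR(line_text, 31, 10)           AS region,
       TO_NUMBER(SUBSTR(line_text, 41))    AS amount
FROM   report_lines
ORDER  BY line_no;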
Spectre, how about writing all of it to a CLOB? Hmm, I’m trying to figure out how to write the CLOB out then… I’ve never worked with LOBs much.
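a rough sketch of the CLOB idea, for what it’s worth (variable names illustrative; accumulating is the easy half, getting it back out is the part that takes work):

```sql
DECLARE
  v_report CLOB;
  v_line   VARCHAR2(4000);
BEGIN
  DBMS_LOB.CREATETEMPORARY(v_report, TRUE);

  -- for each generated report line:
  v_line := 'some,generated,line';
  DBMS_LOB.WRITEAPPEND(v_report, LENGTH(v_line) + 1, v_line || CHR(10));

  -- writing it back out is the sticking point: one route is a
  -- DBMS_LOB.READ loop, pushing each chunk through UTL_FILE.PUT
  DBMS_LOB.FREETEMPORARY(v_report);
END;
/
```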
lachesis, I don’t see what that buys him. If he’s going to have to manipulate the data from the standard table, how is he better off than when he started? He might as well just create a view (well, unless the report is very complex and can’t be handled in a single SQL statement).
Spectre, what are you trying to do that violates DBMS_Util limitations? I’ve never found them to be that constraining.
er, make that utl_file, not dbms_util. I never can remember syntax, even when I use it every day!
and btw, what the hell kind of report goes more than 1 M? Good lord, man, War and Peace isn’t that many characters! I’ve never seen a text document more than a couple hundred K. I’m serious here; a report that big is just begging to never be opened. If your requirements are forcing that size, I strongly recommend you go back to your users/analysts and work out some kind of summary report that people will actually look at!
If you have that much data that people want to have available just in case, I strongly recommend one of the OLAP tools, such as Cognos PowerPlay. These allow data to be rolled-up or expanded as the user needs. They are ideal for analysts who aren’t sure when they start out where their analysis is going to take them, as they allow data to be sliced and diced in a large variety of ways.
I’m sorry, I’ve hijacked this. But the idea of a report that large is really disturbing! More data does not necessarily make a better report - in fact, the reverse is often true.
Sorry for the lecture; I’ve been programming since 1977, and in PL/SQL and data warehousing for the past five years or so. I wish you the best of luck!
BTW, where is your company and are they hiring? My current contract is ending!
In our installation at least, the constraining factor of UTL_FILE is that you can only write to a directory that’s specified in the database configuration file. It’s just not flexible.
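for context, the restriction being described is the UTL_FILE_DIR parameter in the init.ora, which whitelists the directories UTL_FILE may open (the path and file names below are made up):

```sql
-- init.ora (needs a bounce, and the DBA owns it):
--   utl_file_dir = /u01/app/reports

DECLARE
  v_fh UTL_FILE.FILE_TYPE;
BEGIN
  -- the directory string must match an entry in utl_file_dir,
  -- and it must be a disk the Oracle *server* itself can see
  v_fh := UTL_FILE.FOPEN('/u01/app/reports', 'detail_report.csv', 'w');
  UTL_FILE.PUT_LINE(v_fh, 'cust_name,region,amount');
  UTL_FILE.FCLOSE(v_fh);
END;
/
```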
It’s actually not a report meant for eyes, but rather a CSV file meant for import into Access. This is done by the users for further analysis and manipulation of the data, and yes, I realize it’s a symptom of a gap in the system that we should address elsewhere.
What you can do is to write the file to a local directory, and then use java to move the file wherever you want it. I have the code for a java class that allows you to execute OS commands (and in fact got it from the web - it’s available all over), and can send it to you tonight if you like.
Gee, I haven’t found the limitation of the directory having to be specified in a database config file. Is that an Oracle limitation or a local system limitation, do you know? I have run across the annoying limitation that you have to write to a system disk of the Oracle server, hence my suggestion above.
If you’re importing this into Access, maybe you’d be better off writing a VB routine in Access using linked tables? You might not even need code, depending on the complexity of your report - if you can make a view that is the report, you could just link to it in Access, then copy that “table” to a local table. Sounds like a very simple script. Or do you have firewall or other location limitations that prevent such a linkage?
what does it buy him? in the normal course of events, not a blessed thing, really. i did say it would be a messy, painful option. but he did mention that his DBA was likely to have kittens if new tables were created every time he ran a PL/SQL routine. so i was trying to give him a “universal table”-type option to use instead.
however, now that Spectre has mentioned that all the data generated is supposed to be exported to Access, it’s actually slightly more of a workable idea, BION. since (i’m assuming) he’ll want to create a flat ASCII file that will upload into the Access database, a single-record-holds-everything table would be perfectly fine. everything can just get spooled out into a target directory, and from there FTPed (or whatever) over to wherever Access can pick it up. (assuming that the two – PL/SQL and Access – are running on different hardware environments.)
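the spool-it-out step could be as simple as a SQL*Plus wrapper like this, assuming the universal table is called report_lines (all names and paths illustrative):

```sql
-- dump the universal table to a flat file Access can pick up
SET PAGESIZE 0 LINESIZE 4000 FEEDBACK OFF TRIMSPOOL ON
SPOOL /u01/app/reports/for_access.csv
SELECT line_text FROM report_lines ORDER BY line_no;
SPOOL OFF
```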
oh yes… presumably he’s running PL/SQL because complex comparisons/analyses/reformattings are required on the data, above and beyond what even the best SQL coding could handle. (and i’ve “taught” SQL to do some fairly interesting little things in the course of generating reports around here.)