Access 2010: Saving a database

OK, the script below will convert CSV files into fixed width - to use it:
- Create a folder somewhere
- Create a text file in the folder and rename it script.vbs
- Open script.vbs in a text editor and paste in the code below
- (for testing) create a CSV file in the folder named input.csv - it should have at least 4 columns of data in it
- Double-click script.vbs to run it
- A file named output.txt should appear containing the converted data

The PadTrim lines (the ones that set f01, f02 and f03) are where you configure the column selection and the output formatting.
The objTargetFile.WriteLine line is the bit that actually writes each line to the output file.
(so by adding more PadTrim lines and extending the WriteLine, and adapting them, you should be able to cope with any number of input columns and any fixed-width output - there's an example of extending it just after the sample output further down)

(the lines that begin with ' are just comments explaining the code)



Const strSourceFile = "input.CSV"
Const strTargetFile = "output.txt"
 
Dim strData
 
Dim objFSO, objSourceFile, objTargetFile
Set objFSO = CreateObject("Scripting.FileSystemObject")
 
Set objSourceFile = objFSO.OpenTextFile(strSourceFile, 1, True)
Set objTargetFile = objFSO.CreateTextFile(strTargetFile, True)
 
Do While Not objSourceFile.AtEndOfStream
	strData = objSourceFile.ReadLine
	strDataFields = Split(strData, ",")
	
	'add lines (one per output field) and configure them for the formatting you want
	'the values inside the brackets denote: the column number in the CSV file (zero is the first column), the desired output field width, right-justification (false = left justification), and the character to be used for padding strings up to that length
	f01 = PadTrim(strDataFields(0), 8, false, " ")
	f02 = PadTrim(strDataFields(1), 9, true, "0")
	f03 = PadTrim(strDataFields(3), 5, false, ".")
	'So:
	'the first field in the CSV file will be forced to 8 characters long, left justified and padded with spaces
	'the second field in the CSV file will be forced to 9 characters long, right justified, and padded with zeroes
	'the third field in the CSV file is ignored (note, we only asked for columns 0, 1 and 3 - they're numbered from zero)
	'the fourth field in the CSV file will be forced to 5 characters long, left justified and padded with dots 
	
	
	
	'change this next line if you need to output more fields
	objTargetFile.WriteLine(f01 & f02 & f03)
	
Loop
 
Function PadTrim(strInput, length, rightJ, padChar)
	'This function forces a string to a specific length by either padding it (with a specified character) or truncating it
	'You can choose whether the value passed in will be padded/truncated to justify right (which also truncates from the left if too long) or vice versa
	strInput = Trim(strInput)
	PadTrim = strInput
	If Len(strInput) > length Then
		If rightJ = True Then
			PadTrim = Right(strInput, length)
		Else
			PadTrim = Left(strInput, length)
		End If
	End If

	If Len(strInput) < length Then
		If rightJ = True Then
			PadTrim = String(length - Len(strInput), padChar) & strInput
		Else
			PadTrim = strInput & String(length - Len(strInput), padChar)
		End If
	End If
End Function


So in the context of testing, this will convert this CSV file:


a,b,c,d
d,e,f,g
1,2,3,4
3,4,5,a,c,b

into this:


a       00000000bd....
d       00000000eg....
1       0000000024....
3       000000004a....
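
For example, if you also wanted to output the third CSV column (the one the test above ignores), you'd add one more PadTrim line and extend the WriteLine - just a sketch here; the width of 6 and the space padding are made-up values:

	f01 = PadTrim(strDataFields(0), 8, false, " ")
	f02 = PadTrim(strDataFields(1), 9, true, "0")
	f03 = PadTrim(strDataFields(3), 5, false, ".")
	f04 = PadTrim(strDataFields(2), 6, false, " ")	'extra field: the third CSV column, 6 characters, space-padded
	
	'change the WriteLine to include the new field
	objTargetFile.WriteLine(f01 & f02 & f03 & f04)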


Thanks, Mangetout.

First half of the month, so I’m slammed right now. I’ll attempt this when things calm down a bit.

No problem. PM me if you need help setting it up.

My office has a database we create and use each day.

I defined the database and forms several years ago. Then backed it up to a folder. That file gets copied fresh into a daily production folder.

Staff uses the forms to enter data throughout the day.

I have a script that copies and date stamps the daily production database into an archive folder.

A fresh copy of the database is copied into the daily production folder and we’re ready for the next day.
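
(A copy-and-date-stamp step like that can be done with a few lines of VBScript; the sketch below is only an illustration - the folder paths and file names are placeholders, not the actual setup.)

' Sketch only - paths and file names are placeholders
Const strProduction = "C:\Production\Daily.accdb"
Const strArchiveDir = "C:\Archive\"
Const strTemplate = "C:\Templates\Daily.accdb"

Dim objFSO, strStamp
Set objFSO = CreateObject("Scripting.FileSystemObject")

'build a date stamp like 2024-05-01 for the archived copy's name
strStamp = Year(Date) & "-" & Right("0" & Month(Date), 2) & "-" & Right("0" & Day(Date), 2)

'archive the day's production database with the date stamp in its name
objFSO.CopyFile strProduction, strArchiveDir & "Daily_" & strStamp & ".accdb", True

'drop a fresh copy of the template into the production folder for the next day
objFSO.CopyFile strTemplate, strProduction, True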

I run reports against the archived files.

I wrote a program that reads the records from the daily database and updates the full database.
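
(One common way to do that kind of daily-to-full append is a single append query run over ADO - the sketch below is only an illustration; the provider, paths and table name are assumptions, not the actual program.)

' Sketch only - provider, paths and table name are assumptions
Const strDaily = "C:\Production\Daily.accdb"
Const strFull = "C:\Production\Full.accdb"

Dim objConn
Set objConn = CreateObject("ADODB.Connection")
objConn.Open "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" & strFull

'append the day's records from the daily database's table into the same table in the full database
objConn.Execute "INSERT INTO tblRecords SELECT * FROM tblRecords IN '" & strDaily & "'"

objConn.Close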

The goal is keeping the daily database small. The full database has a fiscal year of records and is quite large. It gets archived at the end of each FY.

Just also to note, the above script only processes one file named ‘input.csv’ - if you want it to loop through and process an indefinite batch of CSV files in a folder, we can make it do that (something like the sketch below).
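
For example, the single-file version could be wrapped in a loop like this - a sketch only; it assumes every .csv file in the script's folder should be processed, with all the results appended to one output.txt, and it needs the unchanged PadTrim function from the original script pasted underneath it:

'sketch: process every .csv file in the script's folder into one output file
'(the PadTrim function from the original script needs to be pasted below this, unchanged)
Dim objFSO, objFolder, objFile, objSourceFile, objTargetFile, strData, strDataFields, f01, f02, f03
Set objFSO = CreateObject("Scripting.FileSystemObject")
Set objFolder = objFSO.GetFolder(".")
Set objTargetFile = objFSO.CreateTextFile("output.txt", True)

For Each objFile In objFolder.Files
	If LCase(objFSO.GetExtensionName(objFile.Name)) = "csv" Then
		Set objSourceFile = objFSO.OpenTextFile(objFile.Path, 1)
		Do While Not objSourceFile.AtEndOfStream
			strData = objSourceFile.ReadLine
			strDataFields = Split(strData, ",")
			f01 = PadTrim(strDataFields(0), 8, false, " ")
			f02 = PadTrim(strDataFields(1), 9, true, "0")
			f03 = PadTrim(strDataFields(3), 5, false, ".")
			objTargetFile.WriteLine(f01 & f02 & f03)
		Loop
		objSourceFile.Close
	End If
Next

objTargetFile.Close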

Also, it has no error trapping of any kind in it - it relies on the inputs being perfect - if there’s a risk that might not always happen, we can add some safeties.
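
As one example of the sort of safety that could go in (just a sketch): a check inside the read loop that logs and skips any line that doesn't have at least four fields, instead of stopping with an error. It assumes a rejects file has been created near the top with something like Set objRejectFile = objFSO.CreateTextFile("rejects.txt", True):

	strDataFields = Split(strData, ",")
	If UBound(strDataFields) < 3 Then
		'fewer than four fields on this line - log it and move on rather than crashing
		objRejectFile.WriteLine(strData)
	Else
		f01 = PadTrim(strDataFields(0), 8, false, " ")
		f02 = PadTrim(strDataFields(1), 9, true, "0")
		f03 = PadTrim(strDataFields(3), 5, false, ".")
		objTargetFile.WriteLine(f01 & f02 & f03)
	End If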

As I said, I only use Access to turn .csv files into fixed-position text files. I use Easytrieve to make the data ‘perfect’. As long as Access truncates fields instead of shifting rows (i.e., not outputting fixed-position records as Access 2003 did), that’s all I need.

It would be nice if I had a tool that would ‘clean up’ data, as I still have to do a bit of manual cleanup after running my programs. FWIW, I put names and addresses into a standard syntax. e.g., STREET = ST, STE/SUITE/UNIT/APT/TRLR/etc. = #, PARKWAY = PKWY, CORPORATION = CORP, NORTH = N, delete periods, and so on… except if the period is part of a distance or .COM/.NET/.ORG, PARKWAY or directionals are the name of the street, STREET or DRIVE are part of STREETER or DRIVER (also look for LANEY, PLACER, PLACEN), and so forth. Right now I do a manual cleanup the first time, and then compare new data to that file. Then I only have to clean up the new records, and that file becomes the new previous file. I do have to go back and do a manual cleanup periodically in case there are any name or address changes. But the system has worked well over the past decade. I keep hoping I’ll win the lottery so I don’t have to mess with it anymore. :stuck_out_tongue: