OsakaWebbie
Programmer
I keep running into roadblocks on what seems like it ought to be simple. Perhaps someone else has some insight.
Bottom line goal: take a big (8 MB) file of SQL insert commands, convert the Shift-JIS text in it to Unicode, and insert the records into a waiting database on a web hosting server. I have Apache/PHP/MySQL running on my local Windows machine as well (in fact, that's where the big file came from - mysqldump). Obviously the "right" way would be to convert the text with a Perl script run from the command line and pipe the file into the mysql command. But I don't have access to the command line on the hosting server, and I tried running Perl as CGI but couldn't get it to work. Customer support with this particular hosting service seems to be nonexistent. So I am forced to make PHP do all the work.
I'm running phpMyAdmin on the hosting server, so I started out thinking that I should just prepare an SQL file to upload using its utilities. But the Japanese text needs to be converted first. The hosting server has a PHP module with commands for doing the conversion, but opening files for writing doesn't work (no doubt the Apache user doesn't have write permission on the directory). So I wrote a script that opens a file of SQL, parses it into complete SQL commands, converts any Japanese text, and runs each SQL command against the database. It runs great on small files, but the one table that has 120,000 records causes the PHP processor to time out after 30 seconds. Figuring I only had to do this once, yesterday I spent the afternoon pulling 5000-line chunks out of the file with a text editor, FTPing each chunk, and running the script on it. But I reused the same working file over and over, so I no longer have the data in chunks, and now, due to my own ignorance (my fields weren't long enough, since the converted text takes more bytes than the original Shift-JIS), I need to do it all again.
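In case it helps, the conversion script is essentially the following (a simplified sketch, not the exact code - the connection details, file name, and the "line ending in );" test for the end of a statement are just stand-ins for what my real script does):

<?php
// Rough sketch of the conversion/insert script - connection details,
// file name, and end-of-statement test are placeholders.
$link = mysql_connect('localhost', 'dbuser', 'dbpass') or die(mysql_error());
mysql_select_db('mydb', $link) or die(mysql_error());

$command = '';
foreach (file('sql.txt') as $line) {
    $command .= $line;
    // Treat a line ending in ");" as the end of a complete INSERT statement
    if (preg_match('/\);\s*$/', $line)) {
        // Convert the Japanese text from Shift-JIS to UTF-8 before running it
        $query = mb_convert_encoding($command, 'UTF-8', 'SJIS');
        mysql_query($query, $link) or die(mysql_error());
        $command = '';
    }
}
?>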
I thought I had a great idea - I would write a quick little PHP script to run locally (where I do have write permission) just to break the file into pieces. It tests fine on small files, but when I try it with the big one, my browser gives me a generic error claiming that the file (the PHP one!) is not found or has a DNS problem (yeah, right - I'm running on localhost!). This happens whether I specify the filename in the URL (mycode.php?f=sql.txt) or hardcode it. I assume it's some sort of timeout problem again (although it gives up after about 5-10 seconds, not 30), just with a less informative error message and less to show for it (at least the script on the host that runs the SQL commands would do as many as it could in its 30 seconds!).
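For reference, the local splitter is basically this (again a simplified sketch - the file names and the 5000-line chunk size are just examples, and the set_time_limit(0) at the top is something I can try locally to rule out the execution-time limit):

<?php
// Local file-splitter sketch - file names and chunk size are examples only.
set_time_limit(0);                 // rule out PHP's execution-time limit (running locally anyway)

$in = fopen('sql.txt', 'r') or die('cannot open sql.txt');
$chunk = 1;
$count = 0;
$out = fopen("chunk$chunk.sql", 'w');

while (($line = fgets($in)) !== false) {
    fwrite($out, $line);
    if (++$count >= 5000) {        // start a new chunk file every 5000 lines
        fclose($out);
        $chunk++;
        $count = 0;
        $out = fopen("chunk$chunk.sql", 'w');
    }
}
fclose($out);
fclose($in);
echo "Wrote $chunk chunk files.";
?>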
Do I need to give up and spend another afternoon breaking the file up by hand? Any suggestions on how I could automate this?