Details
- Type: Task
- Status: Open
- Priority: Minor
- Resolution: Unresolved
- Fix Version/s: 10.1
- Component/s: None
- Labels: None
Description
In order of priority:
1 - streaming data directly to the import
2 - generating a rejected-records file
3 - ability to stop the import after a given number of errors
4 - storing metadata (field delimiter, destination table) in the file
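A minimal sketch of what request #4 could look like from the user's side: keep the field delimiter and destination table as metadata in a sidecar file next to the data, and have the import wrapper read them back before calling mysqlimport. The sidecar file name (`data.csv.meta`) and its key=value format are assumptions for illustration, not an existing mysqlimport feature.

```shell
#!/bin/sh
# Create a demo data file and its hypothetical sidecar metadata file.
printf '1;alice\n2;bob\n' > data.csv
printf 'delimiter=;\ntable=customers\n' > data.csv.meta

# An import wrapper would read the metadata back before invoking mysqlimport:
delimiter=$(sed -n 's/^delimiter=//p' data.csv.meta)
table=$(sed -n 's/^table=//p' data.csv.meta)
echo "import into table '$table' with delimiter '$delimiter'"

# Cleanup of the demo files.
rm -f data.csv data.csv.meta
```

With metadata travelling alongside the file, the same wrapper could import heterogeneous CSV files without per-file command-line options.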
Here is a trick to import from stdin:
#!/bin/bash
MAXTHREAD="$(grep -c ^processor /proc/cpuinfo)"
DELAY=0.1

import_csv_gz() {
    date
    echo "$1"
    mkfifo "${1}_fifo"
    # mysqlimport reads from the FIFO in the background while zcat feeds it
    mysqlimport --fields-terminated-by=';' --fields-optionally-enclosed-by='"' --lines-terminated-by='\n' --local test "${1}_fifo" &
    zcat "$1" > "${1}_fifo"
    rm "${1}_fifo"
    mv "$1" /home/mysql/csv/done
}

## MAIN
echo "$MAXTHREAD CPU found!"
for i in /home/mysql/csv/*csvgz; do
    import_csv_gz "$i" &
    # throttle: keep at most one import job per CPU running
    while (( $(jobs | wc -l) >= MAXTHREAD )); do
        sleep "$DELAY"
        jobs > /dev/null
    done
done
wait
echo "all done!"