
How do I import a CSV file into a MySQL table?

I have an unnormalized events-diary CSV from a client that I'm trying to load into a MySQL table so that I can refactor it into a sane format. I created a table called 'CSVImport' that has one field for every column of the CSV file. The CSV contains 99 columns, so this was a hard enough task in itself:

CREATE TABLE `CSVImport` (id INT);
ALTER TABLE CSVImport ADD COLUMN Title VARCHAR(256);
ALTER TABLE CSVImport ADD COLUMN Company VARCHAR(256);
ALTER TABLE CSVImport ADD COLUMN NumTickets VARCHAR(256);
...
ALTER TABLE CSVImport ADD COLUMN Date49 VARCHAR(256);
ALTER TABLE CSVImport ADD COLUMN Date50 VARCHAR(256);

There are no constraints on the table, and all the fields hold VARCHAR(256) values, except the columns that contain counts (represented by INT), yes/no flags (represented by BIT), prices (represented by DECIMAL), and text blurbs (represented by TEXT).
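For reference, a condensed sketch of what such a staging table looks like; id, Title, Company, and NumTickets appear in the question's DDL, while the remaining columns and their exact sizes here are illustrative placeholders:

CREATE TABLE `CSVImport` (
    id          INT,
    Title       VARCHAR(256),   -- free-form text fields
    Company     VARCHAR(256),
    NumTickets  INT,            -- counts
    IsPaid      BIT,            -- yes/no flags (hypothetical column)
    Price       DECIMAL(10,2),  -- prices (precision is a guess)
    Blurb       TEXT            -- long text blurbs
    -- ...and so on for the remaining columns
);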

I tried to load the data into the table:

LOAD DATA INFILE '/home/paul/clientdata.csv' INTO TABLE CSVImport;
Query OK, 2023 rows affected, 65535 warnings (0.08 sec)
Records: 2023  Deleted: 0  Skipped: 0  Warnings: 198256
SELECT * FROM CSVImport;
| NULL             | NULL        | NULL           | NULL | NULL               | 
...

The whole table is filled with NULL.

I think the problem is that the text blurbs contain more than one line, and MySQL is parsing the file as if each new line corresponded to one database row. I can load the file into OpenOffice without a problem.

The clientdata.csv file contains 2593 lines and 570 records. The first line contains column names. I think it is comma-delimited, and text is apparently delimited with double quotes.

UPDATE:

When in doubt, read the manual: http://dev.mysql.com/doc/refman/5.0/en/load-data.html

I added some information to the LOAD DATA statement that OpenOffice was smart enough to infer, and now it loads the correct number of records:

LOAD DATA INFILE "/home/paul/clientdata.csv"
INTO TABLE CSVImport
COLUMNS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;

But still there are lots of completely NULL records, and none of the data that got loaded seems to be in the right place.

And if you're on OS X, Sequel Pro has an awesome import tool, and it's FREE ;-)
It's amazing to me that the original poster answered his own question better than anyone else... I don't know why so many people are so ready to give software recommendations when there is an existing SQL command, which can be programmatic rather than UI-based. I don't know about anyone else, but programmatic means to me that I can have scripts set up to auto-import files on timestamps, whereas UI-based is purely manual.
@ChrisCirefice: I think the accepted answer explains that well. He needed some help manually creating the 'load data' command, which a graphical program can help with. Once the graphical program had created the 'load data' command, he could then reuse it programmatically.
@Merrick this worked a treat on OS X
@ChrisCirefice scripts are awesome for repeated events; but GUIs are better for one-time things, because you don't have to figure out all the esoterica just to do the one thing one time

Bob Stein

Use mysqlimport to load a table into the database:

mysqlimport --ignore-lines=1 \
            --fields-terminated-by=, \
            --local -u root \
            -p Database \
             TableName.csv

I found it at http://chriseiffel.com/everything-linux/how-to-import-a-large-csv-file-to-mysql/

To make the delimiter a tab, use --fields-terminated-by='\t'
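Note that mysqlimport derives the table name from the file name by stripping the extension, so the command above loads TableName.csv into a table literally named TableName. That table must already exist; a minimal sketch, with hypothetical column names:

CREATE TABLE TableName (
    col1 VARCHAR(256),
    col2 VARCHAR(256),
    col3 VARCHAR(256)
);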


mysqlimport uses LOAD DATA INFILE... behind the scenes, so it's pretty much the same thing.
As with LOAD DATA INFILE, you need to create a table before you can use mysqlimport.
@MladenJablanović, it's definitely not the same thing. Try importing 1 billion rows. You'll be surprised what a massive difference it makes in terms of performance
also need --fields-optionally-enclosed-by=\" and --fields-escaped-by=\\
Should be the correct answer. It talks about the CLI, which the question was discussing, as opposed to a GUI.
Franck Dernoncourt

The core of your problem seems to be matching the columns in the CSV file to those in the table.

Many graphical MySQL clients have very nice import dialogs for this kind of thing.

My favourite for the job is the Windows-based HeidiSQL. It gives you a graphical interface to build the LOAD DATA command; you can re-use it programmatically later.

https://i.stack.imgur.com/RTux4.png

Screenshot: "Import textfile" dialog

To open the "Import textfile" dialog, go to Tools > Import CSV file:

https://i.stack.imgur.com/2HHSa.png


For Mac OS X, use Sequel Pro.
I just tried it and it requires me to create the table first... instead of using the column names.
You have to select a table before you can continue... and since the whole point is not having to make the table...
Note that in Linux, HeidiSQL works very well under Wine.
@Paul the very point of the answer is that GUI tools can make matching import columns to table columns easier.
DareDevil

The simplest way I have found to import 200+ rows is the command below, run in the phpMyAdmin SQL window.

I have a simple country table with two columns: CountryId, CountryName.

https://i.stack.imgur.com/yOSm3.png

Here is the command:

LOAD DATA INFILE 'c:/country.csv' 
INTO TABLE country 
FIELDS TERMINATED BY ',' 
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS

Keep one thing in mind: a comma must never appear inside the second column unless the value is enclosed in quotes, otherwise the import will break.


For TSV files: ',' -> '\t' and '"' -> '', and drop the IGNORE 1 ROWS line if there is no header. (Hope search crawlers index this.)
If it's a local file, you might need to LOAD DATA LOCAL INFILE. If this throws an error 1148 "used command is not allowed", you can enable it by running mysql on the command line with --local-infile.
i got the error : ERROR 1045 (28000): Access denied for user 'user'@'%' (using password: YES)
I am getting this error: The MySQL server is running with the --secure-file-priv option so it cannot execute this statement
Perfect answer; just run it with LOAD DATA LOCAL INFILE and all is fine
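For the errors mentioned in the comments above, two server settings are the usual culprits. A quick way to inspect and, where possible, change them (SET GLOBAL needs administrative privileges, and secure_file_priv itself can only be set in the server configuration):

-- Check whether the server allows LOAD DATA LOCAL INFILE
SHOW GLOBAL VARIABLES LIKE 'local_infile';

-- Enable it at runtime (requires sufficient privileges)
SET GLOBAL local_infile = 1;

-- See which directory, if any, non-LOCAL LOAD DATA is restricted to
SHOW VARIABLES LIKE 'secure_file_priv';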
G M

I used this method to import more than 100K records (~5MB) in 0.046 sec.

Here's how you do it:

LOAD DATA LOCAL INFILE  
'c:/temp/some-file.csv'
INTO TABLE your_awesome_table  
FIELDS TERMINATED BY ',' 
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(field_1, field_2, field_3);

It is very important to include the last line if you have more than one field, i.e. normally it skips the last field otherwise (MySQL 5.6.17):

LINES TERMINATED BY '\n'
(field_1, field_2, field_3);

Then, assuming the first row holds the titles for your fields, you might want to include this line as well:

IGNORE 1 ROWS

This is what it looks like if your file has a header row:

LOAD DATA LOCAL INFILE  
'c:/temp/some-file.csv'
INTO TABLE your_awesome_table  
FIELDS TERMINATED BY ',' 
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(field_1, field_2, field_3);

I imported 16k rows and 48 columns. Thank you, buddy.
This is the best solution. For anyone curious how fast this is: I imported 3.2 million rows in under 14 secs on a $20/mo AWS Lightsail instance, with the onboard MySQL server (not a high performance RDS). Awesome!
In this line, (field_1,field_2 , field_3); does it refer to the list of columns in a .csv file or in a table? i.e., what are these fields?
@tera_789 The ones in your csv file that will probably match your table
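To illustrate the answer to tera_789's question: the names in the parentheses are table columns, matched left to right against the fields of each CSV line. A CSV field you do not want can be read into a user variable and simply never used (a sketch, reusing the hypothetical names from the answer above):

LOAD DATA LOCAL INFILE 'c:/temp/some-file.csv'
INTO TABLE your_awesome_table
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(field_1, @discard, field_3);  -- the second CSV field goes into @discard and is never stored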
Pang

phpMyAdmin can handle CSV import. Here are the steps:

1. Prepare the CSV file to have the fields in the same order as the MySQL table fields.
2. Remove the header row from the CSV (if any), so that only the data is in the file.
3. Go to the phpMyAdmin interface.
4. Select the table in the left menu.
5. Click the import button at the top.
6. Browse to the CSV file.
7. Select the option "CSV using LOAD DATA".
8. Enter "," in the "fields terminated by" box.
9. Enter the column names in the same order as they are in the database table.
10. Click the go button and you are done.

This is a note that I prepared for my future use, and sharing here if someone else can benefit.


This is good and simple. I prefer creating tables and columns via SQL (so I skip Step #9) and inserting data via importing CSV. Do not forget to set NULL in the CSV for any auto-incrementing fields/columns.
Note that phpMyAdmin fails miserably when Cyrillic characters are included in the CSV file, no matter whether you tell it to use UTF-8.
Don't forget to change the import size if you're importing a large CSV file. BTW it's not a good option for large CSV files.
This requires downloading the .csv file to a web client first as you cannot browse to a local csv file.
Yes you can, you need to check the checkbox marked 'Local Keyword' and you can use a local csv file
Vitaliy Pak

If you are using MySQL Workbench (currently version 6.3), you can do this by:

1. Right-click on "Tables".
2. Choose Table Data Import Wizard.
3. Choose your csv file and follow the instructions (JSON could also be used).

The good thing is that you can create a new table based on the csv file you want to import, or load the data into an existing table.

https://i.stack.imgur.com/5az8h.jpg


+1. I used this because mysql kept giving me errors for LOAD DATA INFILE and mysqlimport ("this is not supported on this version of mysql")
This method works, but it's kind of slow. I would have thought that using this feature would build a massive INSERT query and try to do it all at once, but it looks like doing it this way actually runs one INSERT per row.
This method saved me. I was getting a lot of errors with LOAD DATA, but was in a hurry. I highly recommend it if any reader is having problems with LOAD DATA.
Using Workbench 8.0.22 this feature is far from stable. Keeps crashing, importing 0 lines or just freezes. Tried it with Our World in Data's COVID-19 CSV: github.com/owid/covid-19-data/tree/master/public/data
lafncow

You can fix this by listing the columns in your LOAD DATA statement. From the manual:

LOAD DATA INFILE 'persondata.txt' INTO TABLE persondata (col1,col2,...);

...so in your case you need to list the 99 columns in the order in which they appear in the CSV file.
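Applied to the question's statement, that would look roughly like the sketch below. Only the first few of the 99 columns are written out, and the @price/SET pair is an optional extra that cleans a value up on the way in (the Price column is assumed here, not taken from the question):

LOAD DATA INFILE '/home/paul/clientdata.csv'
INTO TABLE CSVImport
COLUMNS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(id, Title, Company, NumTickets, @price)   -- ...continue through all 99 columns
SET Price = NULLIF(@price, '');            -- turn empty strings into NULL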


David

Try this; it worked for me:

    LOAD DATA LOCAL INFILE 'filename.csv'
    INTO TABLE table_name
    FIELDS TERMINATED BY ','
    ENCLOSED BY '"'
    IGNORE 1 ROWS;

IGNORE 1 ROWS here ignores the first row, which contains the field names. Note that for the filename you must type the absolute path of the file.


This is the best answer. Why use another tool when a single SQL command will do?
Do you know how to make this work when you are trying to load the file into MySQL running on a server? It gives me access denied for the file (using password: YES). Where do I enter the password for the CSV file's location?
Juan

I see something strange. You are using for ESCAPING the same character you use for ENCLOSING, so the engine does not know what to do when it finds a '"', and I think that is why nothing seems to be in the right place. I think that if you remove the ESCAPED BY line, it should run great. Like:

LOAD DATA INFILE "/home/paul/clientdata.csv"
INTO TABLE CSVImport
COLUMNS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;

Unless you analyze (manually, visually, ...) your CSV and find out which character it uses for escaping. Sometimes it is '\'. But if your file does not have one, do not specify it.
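If you would rather be explicit than rely on the default, MySQL also lets you turn escape processing off entirely with an empty ESCAPED BY. This suits CSV exports that only ever double the quote character inside quoted fields, since doubled enclosure characters are still read as a single literal instance:

LOAD DATA INFILE "/home/paul/clientdata.csv"
INTO TABLE CSVImport
COLUMNS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY ''
LINES TERMINATED BY '\n'
IGNORE 1 LINES;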


user1464271

The mysql command line is prone to too many problems on import. Here is how you do it:

Use Excel to edit the header names so they have no spaces

Save as .csv

Use the free Navicat Lite SQL Browser to import and auto-create a new table (give it a name)

Open the new table and insert a primary auto-number column for ID (see the SQL sketch below)

Change the type of the columns as desired.

Done!
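For the primary-key step, the equivalent SQL is a one-liner (the table name is a placeholder):

ALTER TABLE my_imported_table
    ADD COLUMN id INT NOT NULL AUTO_INCREMENT PRIMARY KEY FIRST;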


ruvim

Yet another solution is to use the csvsql tool from the amazing csvkit suite.

Usage example:

csvsql --db mysql://$user:$password@localhost/$database --insert --tables $tablename  $file

This tool can automatically infer the data types (that is the default behavior), create the table, and insert the data into the created table. The --overwrite option can be used to drop the table if it already exists, and the --insert option populates the table from the file.
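For illustration, the inferred schema for a file with an integer, a decimal, and a text column might come out roughly like this (a hypothetical sketch, not actual tool output):

CREATE TABLE mytable (
    id INTEGER NOT NULL,
    price DECIMAL(38, 2),
    name VARCHAR(32)
);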

To install the suite:

pip install csvkit

Prerequisites: python-dev, libmysqlclient-dev, MySQL-python

apt-get install python-dev libmysqlclient-dev
pip install MySQL-python

Georgy Gobozov

In case you are using IntelliJ DataGrip: https://www.jetbrains.com/datagrip/features/importexport.html

https://i.stack.imgur.com/79O2N.png


Mehdi

I use MySQL Workbench to do the same job.

1. Create a new schema.
2. Open the newly created schema.
3. Right-click on "Tables" and select "Table Data Import Wizard".
4. Give the csv file path and table name.
5. Finally, configure your column types, because the wizard sets default column types based on their values.

Note: take a look at MySQL Workbench's log file for any errors, by using "tail -f [mysqlworkbenchpath]/log/wb*.log"


Thanks so much for your answer - being new to MySQL I didn't know about this - it helped me with CSV. Now, I also need to import 10 tables from Access - do you think the simplest way will be to export these tables into Excel, from Excel to CSV and then use these steps?
I found that this method doesn't import all rows :( Out of 5,342 rows it only imported 2,485 rows for me. Why is that?
Hey Naomi, if you check the MySQL Workbench log, it will show you why it stopped importing data. You may have some null values or a type mismatch between your DB and CSV file. But I highly recommend following Juan's answer (right after mine). His solution is better and cleaner than mine.
Martijn Pieters

How to import CSV files into SQL tables

Example file: Overseas_trade_index data CSV File

Steps:

1. Create a table for overseas_trade_index, with columns matching the CSV file. SQL query:

CREATE TABLE trade_index (
    id INT NOT NULL PRIMARY KEY AUTO_INCREMENT,
    series_reference VARCHAR(60),
    period VARCHAR(60),
    data_value DECIMAL(60,0),
    status VARCHAR(60),
    units VARCHAR(60),
    magnitude INT,
    subject TEXT,
    `group` TEXT,
    series_title_1 VARCHAR(60),
    series_title_2 VARCHAR(60),
    series_title_3 VARCHAR(60),
    series_title_4 VARCHAR(60),
    series_title_5 VARCHAR(60)
);

2. Connect to the MySQL database in the terminal:

SHOW DATABASES;
USE your_database;
SHOW TABLES;

3. Enter this command to import the CSV data into the MySQL table:

LOAD DATA INFILE '/home/desktop/Documents/overseas.csv'
INTO TABLE trade_index
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(series_reference, period, data_value, status, units, magnitude, subject, `group`, series_title_1, series_title_2, series_title_3, series_title_4, series_title_5);

4. Find the overseas trade index data in the database:

SELECT * FROM trade_index;


zipzit

If you are using a Windows machine with an Excel spreadsheet loaded, the new MySQL plugin for Excel is phenomenal. The folks at Oracle really did a nice job on that software. You can make the database connection directly from Excel. That plugin will analyse your data and set up the tables for you in a format consistent with the data. I had some monstrously big csv files of data to convert. This tool was a big time-saver.

http://dev.mysql.com/downloads/windows/excel/

You can make updates from within Excel that will populate to the database online. This worked exceedingly well with MySQL files created on ultra-inexpensive GoDaddy shared hosting. (Note that when you create the table at GoDaddy, you have to select some off-standard settings to enable off-site access to the database...)

With this plugin you have pure interactivity between your Excel spreadsheet and online MySQL data storage.


Michael Tomar

I know that my answer is late, but I'd like to mention a few other ways to do it. The easiest one is using the command line. The steps are as follows:

Accessing the MySQL CLI by entering the command below:

mysql -u my_user_name -p

Creating a table in the database

USE new_schema;

CREATE TABLE employee_details (
    id INTEGER,
    employee_name VARCHAR(100),
    employee_age INTEGER,
    PRIMARY KEY (id)
);

Importing the CSV file into a table. We can either mention the file path or store the file in the default directory of the MySQL server.

LOAD DATA INFILE 'Path to the exported csv file' INTO TABLE employee_details FIELDS TERMINATED BY ',' IGNORE 1 ROWS;
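If you go the default-directory route, the server can tell you where that is; these are standard server variables:

-- Directory to which non-LOCAL LOAD DATA is restricted (empty = unrestricted, NULL = disabled)
SHOW VARIABLES LIKE 'secure_file_priv';

-- The server's data directory, the default base for server-side file lookups
SHOW VARIABLES LIKE 'datadir';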

This is only one of many solutions; I found it in this tutorial. If loading CSV files into a MySQL database is your daily task, then it'll be better to automate the process. In this case you can use some third-party tools that allow you to load data on a schedule.


Suwarnakumar Kanapathipillai

PHP query to import a CSV file into a MySQL database:

// Note: LOAD DATA LOCAL requires local_infile to be enabled on
// both the client connection and the server.
$query = <<<EOF
LOAD DATA LOCAL INFILE '$file'
INTO TABLE users
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(name, mobile, email)
EOF;
if (!$result = mysqli_query($this->db, $query)) {
    exit(mysqli_error($this->db));
}

Sample CSV file data:

name,mobile,email
Christopher Gritton,570-686-3439,ChristopherKGritton@inbound.plus
Brandon Wilson,541-309-5149,BrandonMWilson@inbound.plus
Craig White,516-795-8065,CraigJWhite@inbound.plus
David Whitney,713-214-3966,DavidCWhitney@inbound.plus

Magige Daniel

Here is a sample Excel file screenshot:

https://i.stack.imgur.com/UkOaA.jpg

Save as and choose .csv.

And you will have the .csv data shown in the screenshot below if you open it using Notepad++ or any other text editor.

https://i.stack.imgur.com/QVFEV.jpg

Make sure you remove the header and have the column alignment in the .csv match the MySQL table. Replace folder_name with your folder name:

LOAD DATA LOCAL INFILE 'D:/folder_name/myfilename.csv' INTO TABLE mail FIELDS TERMINATED BY ',' (fname,lname ,email, phone);

If it's big data, you can grab a coffee and let it load!

That's all you need.


Srikrushna

Change the server name, username, password, database name, file path, table name, and the fields of your database that you want to insert into:

<?php
    $servername = "localhost";
    $username = "root";
    $password = "";
    $dbname = "bd_dashboard";
    // Create the connection
    $conn = new mysqli($servername, $username, $password, $dbname);

    $query = "LOAD DATA LOCAL INFILE 
                'C:/Users/lenovo/Desktop/my_data.csv'
                INTO TABLE test_tab
                FIELDS TERMINATED BY ','
                LINES TERMINATED BY '\n'
                IGNORE 1 LINES
                (name,mob)";
    if (!$result = mysqli_query($conn, $query)) {
        echo '<script>alert("Oops... Some error occurred.");</script>';
        exit();
        //exit(mysqli_error($conn));
    } else {
        echo '<script>alert("Data Inserted Successfully.");</script>';
    }
    ?>

Amit Ray

I did it in a simple way using phpMyAdmin. I followed the steps by @Farhan, but all data was entered into a single column. How I did it:

Created a CSV file and deleted the header row with column names. Kept only data. I created a table with column names matching the csv columns. Remember to assign appropriate types to each column. I just selected the import and went to import tab. In browse I selected the CSV file and kept all options as it is. To my surprise all the data got imported successfully in their appropriate columns.