Large data volume processing in Java: inserting 5 million rows with iBATIS is inefficient. How can it be solved?

I built a banking system that reads text data and inserts it into the database. The data volume is very large. We currently use Spring to manage transactions uniformly and iBATIS batch processing for the inserts. 100,000 rows take 36 seconds, 1,000,000 rows take 6 minutes, and 5,000,000 rows take half an hour. How can I speed this up?

I have considered multi-threading, but the program actually spends little time on file I/O, and I read the text line by line. If I used RandomAccessFile instead, would the positioning be inaccurate? And how would I apply multi-threading to only the insert work? Since each line is read and added to the batch statement at the same time, it is not easy to split the two apart.
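For what it's worth, one common way to parallelize only the insert step is a producer-consumer setup: a single thread keeps reading lines in order (so no RandomAccessFile positioning is needed) and hands fixed-size batches to worker threads over a queue. The sketch below only counts rows where the real inserts would go; the class and method names are hypothetical, and in the real program each worker would need its own SqlMapClient session and transaction.

```java
import java.util.*;
import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicInteger;

// Sketch: one reader thread fills a queue with batches of lines;
// several worker threads drain the queue and perform the inserts.
class BatchPipeline {
    // Unique sentinel object telling workers there is no more work.
    private static final List<String> POISON = new ArrayList<>();

    static int run(Iterator<String> lines, int batchSize, int workers) {
        BlockingQueue<List<String>> queue = new ArrayBlockingQueue<>(16);
        AtomicInteger inserted = new AtomicInteger();

        ExecutorService pool = Executors.newFixedThreadPool(workers);
        for (int w = 0; w < workers; w++) {
            pool.submit(() -> {
                try {
                    List<String> batch;
                    while ((batch = queue.take()) != POISON) {
                        // In the real program each worker would open its own
                        // iBATIS session/transaction and insert this batch.
                        inserted.addAndGet(batch.size());
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
        }

        try {
            // Producer: read sequentially and enqueue fixed-size batches.
            List<String> batch = new ArrayList<>(batchSize);
            while (lines.hasNext()) {
                batch.add(lines.next());
                if (batch.size() == batchSize) {
                    queue.put(batch);
                    batch = new ArrayList<>(batchSize);
                }
            }
            if (!batch.isEmpty()) queue.put(batch);
            // One poison pill per worker so every worker terminates.
            for (int w = 0; w < workers; w++) queue.put(POISON);
            pool.shutdown();
            pool.awaitTermination(1, TimeUnit.MINUTES);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return inserted.get();
    }
}
```

Whether this actually helps depends on where the bottleneck is; if the database is the limit, more client threads may not gain much.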

Another question: with transactions, these 5 million rows would first have to be written to the database's cache/log area. Won't that cache overflow?

Below is my code; I hope someone can give me some guidance. I don't have much work experience, so if anything in the code is wrong, please point it out. Thank you.

    FileInputStream fis;
    SqlMapClient smc = this.getSqlMapClient();
    try {
        fis = new FileInputStream(new File(fileName));
        BufferedReader br = new BufferedReader(new InputStreamReader(fis, "gbk"));
        String str = null;
        smc.startTransaction();
        smc.startBatch();
        this.getSqlMapClientTemplate().delete("deleteData1");
        int num = 0;
        while ((str = br.readLine()) != null) {
            String[] arr = str.split("\\|");
            OdsStandingBook osb = new OdsStandingBook();
            Field[] field = osb.getClass().getDeclaredFields();
            for (int i = 0; i < arr.length; i++) {
                String data = field[i].toString();
                data = data.substring(data.lastIndexOf(".") + 1);
                Method method = getSetMethod(osb.getClass(), data);
                method.invoke(osb, arr[i].trim());
            }
            // Reflection assigned the attributes above; these three have no
            // corresponding values in the text file and must be set manually.
            osb.setJuridicalPerson("1");
            osb.setCreateDate(DateUtil.getCurrentDate());
            osb.setCustomerType(arr[4].substring(0, 1));
            smc.insert("insertData1", osb);
            num++;
            if (num % 500 == 0) {
                smc.commitTransaction();
            }
        }
        smc.commitTransaction();
    } catch (Exception e) {
        LcDebug.logWarning(e.getMessage());
    } finally {
        try {
            smc.endTransaction();
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }
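One thing worth double-checking in the loop above: after each commitTransaction() the transaction and batch are never restarted, and executeBatch() is never invoked, so the statements may not actually execute as a batch (this is my reading of the iBATIS SqlMapClient API; verify against your version). Below is a minimal sketch of the intended flush-every-N control flow, using a hypothetical BatchSink stub in place of the real SqlMapClient so the pattern can run without a database:

```java
// Hypothetical stand-in for the SqlMapClient calls, for illustration only.
interface BatchSink {
    void startBatch();   // startTransaction() + startBatch() in the real code
    void add(String row); // insert("insertData1", osb) in the real code
    void flush();        // executeBatch() + commitTransaction() in the real code
}

class ChunkedInserter {
    // Inserts all rows in chunks of chunkSize, restarting the batch after
    // every flush. Returns the number of flushes performed.
    static int insertAll(Iterable<String> rows, int chunkSize, BatchSink sink) {
        int pending = 0, flushes = 0;
        sink.startBatch();
        for (String row : rows) {
            sink.add(row);
            if (++pending == chunkSize) {
                sink.flush();      // execute and commit the current chunk
                sink.startBatch(); // restart the batch for the next chunk
                pending = 0;
                flushes++;
            }
        }
        if (pending > 0) { sink.flush(); flushes++; } // final partial chunk
        return flushes;
    }
}
```

Committing in chunks like this also bounds how much uncommitted data the database must hold at once, which speaks to the cache-overflow worry above.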

5 replies

#1


With that much data, inserting is bound to be slow. The question is whether the bottleneck is the program or the database.

#2


With that much data, inserting is bound to be slow. The bottleneck is not the program but the database.

#3


Since the text content goes straight into the database, you could try using the database's own import script/tool to load the text file directly.

#4


This reply was deleted by the administrator at 2014-03-16 14:21:36

#5


I've never used iBATIS, but try checking this line:

smc.insert("insertData1", osb);

Does it execute only a single INSERT statement, or does it SELECT first and then UPDATE? That has a large impact on performance.

Alternatively, build the INSERT statement by hand and put several records after VALUES.
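The last suggestion, putting several records after VALUES, amounts to building a multi-row INSERT for use with a single PreparedStatement. A small sketch of such a statement builder (the table and column names here are made up, not from the original code):

```java
import java.util.Collections;
import java.util.List;

class MultiRowInsert {
    // Builds e.g. "INSERT INTO t (a,b) VALUES (?,?),(?,?),..." so that one
    // PreparedStatement round-trip inserts rowCount rows instead of one.
    static String build(String table, List<String> cols, int rowCount) {
        String placeholders =
                "(" + String.join(",", Collections.nCopies(cols.size(), "?")) + ")";
        StringBuilder sql = new StringBuilder("INSERT INTO ").append(table)
                .append(" (").append(String.join(",", cols)).append(") VALUES ");
        for (int r = 0; r < rowCount; r++) {
            if (r > 0) sql.append(",");
            sql.append(placeholders);
        }
        return sql.toString();
    }
}
```

Note that databases limit statement size and parameter counts, so the row count per statement should stay modest (a few hundred is a typical choice).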

This article is from the internet and does not represent 1024programmer's position. Please indicate the source when reprinting: https://www.1024programmer.com/large-data-volume-processing-in-java-using-ibatis-to-insert-5-million-pieces-of-data-low-efficiency-how-to-solve-it/

author: admin
