Python looping 300 000 rows

This question builds on my last one.
How can I loop over 300 000 rows and edit each row's string one by one? I have a list of 11-digit numbers stored in a single column in Excel, and I need to separate the digits according to this pattern: 2-2-1-3-3.

I used the code below to test the solution on only 20 rows, and it works.

Example: 00002451018 becomes 00 00 2 451 018.

priceListTest contains the column Column1, which holds these 11-digit numbers. Somehow I need to loop over all 300 000 rows, use get_slices to reformat each row as in the example above, and store the result in the new column New Value.

The for index, row loop works very slowly when I have to use it for 300 000 rows. Maybe there is a better method, but I'm new to Python.

Thanks in advance!

for index, row in priceListTest.iterrows():
    # print(index, row)
    def get_slices(n, sizes, n_digits=11):
        """Yield the digit groups of n, left to right, with the given sizes."""
        for size in sizes:
            n_digits -= size
            val, n = divmod(n, 10 ** n_digits)
            yield f'{val:0{size}}'

    n = row['Column1']
    newVar = ' '.join(get_slices(n, [2, 2, 1, 3, 3]))
    priceListTest.at[index, 'New Value'] = newVar  # .at needs a scalar column label
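
For reference, a minimal vectorized sketch of the same 2-2-1-3-3 split might look like this (just a sketch; I have not verified it on the full 300 000 rows, and it assumes Column1 holds plain integers):

import pandas as pd

# tiny stand-in frame just to illustrate; the real priceListTest has 300 000 rows
priceListTest = pd.DataFrame({'Column1': [2451018, 12345678901]})

# zero-pad each number to 11 characters, then cut at positions 2, 4, 5 and 8
# to get the 2-2-1-3-3 grouping, all as column-wise string operations
s = priceListTest['Column1'].astype('int64').astype(str).str.zfill(11)
priceListTest['New Value'] = (
    s.str[:2] + ' ' + s.str[2:4] + ' ' + s.str[4:5] + ' '
    + s.str[5:8] + ' ' + s.str[8:]
)
print(priceListTest)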

5 comment threads

Parallel execution (1 comment)
The actual performance issue (5 comments)
Types (1 comment)
Create the function just once (3 comments)
A small note regarding MCVE (1 comment)

Comments on Python looping 300 000 rows

The actual performance issue
Alexei‭ wrote 2 months ago:

Can you provide the actual waiting time for running your code? For example: run your code on 10K rows and tell us how long it took to finish.
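
Something along these lines would do (a rough sketch; the loop body is the one from your question):

import time

start = time.perf_counter()
for index, row in priceListTest.head(10_000).iterrows():
    pass  # the body of the loop from the question goes here
print(f'{time.perf_counter() - start:.1f} seconds for 10 000 rows')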

sfrow‭ wrote 2 months ago:

Actually, you are right. For 10k rows it takes 14 seconds, which is not so bad. For the whole list of 300k it will take 7-8 minutes.

elgonzo‭ wrote 2 months ago:

If 14 sec for 10k rows "is not so bad", then 7 min for 300k rows is not any worse and not so bad either: given 7 min for 300k rows, each of the thirty 10k-row chunks in those 300k rows still takes around 14 sec.

sfrow‭ wrote 2 months ago:

The first time I ran it, it took almost 1 hour, but I found that the file contained some bad data in the column, and that caused the long run time.

hkotsubo‭ wrote 2 months ago:

sfrow‭ I've run this test and, surprisingly, using string slices is faster than doing the math. Try changing the algorithm and see if it makes a difference.
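
For example, a string-based variant of get_slices could look like this (a sketch with a hypothetical name, not the exact code from my test):

# pad to 11 characters and cut fixed-width pieces instead of repeated divmod
def get_slices_str(n, sizes=(2, 2, 1, 3, 3), n_digits=11):
    s = str(n).zfill(n_digits)
    pos = 0
    for size in sizes:
        yield s[pos:pos + size]
        pos += size

print(' '.join(get_slices_str(2451018)))  # -> 00 00 2 451 018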