Monday, June 29, 2015

PHP Optimization: Processing millions of MySQL records?

I've got a handful of databases with potentially millions of records each, and I need to run backend services against them pretty frequently (backups, reporting, etc.).

Currently I'm processing records in batches of 1,000 to try to speed things up. It helps a little, but not enough to notice. Some reports customers want to generate can take a few days, which is preposterous.
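For context, here's a minimal sketch of the kind of batched scan I mean, seeking on the primary key instead of paging with OFFSET (OFFSET rescans all the skipped rows, so it gets slower with every batch). The table name, columns, and processRecord() are stand-ins:

<?php
// Keyset pagination sketch: seek past the last-seen primary key
// instead of using LIMIT/OFFSET. Table, columns, and
// processRecord() are hypothetical placeholders.
$mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');
$batchSize = 1000;
$lastId = 0;

do {
    $stmt = $mysqli->prepare(
        'SELECT id, payload FROM records WHERE id > ? ORDER BY id LIMIT ?'
    );
    $stmt->bind_param('ii', $lastId, $batchSize);
    $stmt->execute();
    $result = $stmt->get_result(); // requires mysqlnd

    $rows = 0;
    while ($row = $result->fetch_assoc()) {
        processRecord($row);  // placeholder for per-record work
        $lastId = $row['id']; // advance the cursor
        $rows++;
    }
    $stmt->close();
} while ($rows === $batchSize);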

Then there are backups. My company has a unique way of storing our records, and we need to process each and every one individually, so I can't just export everything into a file and store it off-site.

So, as you can imagine, this causes a lot of backlog pretty quickly.

What are some methods I can look into for speeding this up?

I've already looked into using mysqli_poll, but I still end up blocking while each batch gets processed.
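To make that concrete, the mysqli_poll pattern looks roughly like this: one connection per in-flight query, fire each with MYSQLI_ASYNC, then poll and reap. The table name, query, and processRecord() are stand-ins. Even with the queries dispatched asynchronously, the script serializes again as soon as it starts reaping and processing results:

<?php
// Async mysqli sketch: each in-flight query needs its own connection.
// Table, query, and processRecord() are hypothetical placeholders.
$links = [];
foreach ([0, 1000, 2000, 3000] as $offset) {
    $link = new mysqli('localhost', 'user', 'pass', 'mydb');
    $link->query(
        "SELECT id, payload FROM records LIMIT 1000 OFFSET $offset",
        MYSQLI_ASYNC
    );
    $links[] = $link;
}

$pending = $links;
while ($pending) {
    $reads = $errors = $rejects = $pending;
    if (!mysqli_poll($reads, $errors, $rejects, 1)) {
        continue; // nothing ready within the 1s timeout
    }
    foreach ($reads as $link) {
        if ($result = $link->reap_async_query()) {
            while ($row = $result->fetch_assoc()) {
                processRecord($row); // placeholder per-record work
            }
            $result->free();
        }
        unset($pending[array_search($link, $pending, true)]);
    }
    foreach ($errors as $link) {
        unset($pending[array_search($link, $pending, true)]); // drop failed links
    }
}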

Is threading with pthreads really my only option to dramatically speed things up at this point, or do I need to convince my superiors to switch to a language with real threading support to make this work?
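For reference, a pthreads version would look something like the sketch below. It needs a thread-safe (ZTS) build of PHP with the pthreads extension, CLI only, and each thread has to open its own MySQL connection. The ID ranges and processRecord() are made-up placeholders:

<?php
// pthreads sketch: one worker thread per primary-key range.
// Requires a ZTS PHP build with the pthreads extension (CLI only).
class RangeWorker extends Thread
{
    private $startId;
    private $endId;

    public function __construct($startId, $endId)
    {
        $this->startId = $startId;
        $this->endId   = $endId;
    }

    public function run()
    {
        // mysqli handles can't be shared across threads, so each
        // worker opens its own connection.
        $mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');
        $stmt = $mysqli->prepare(
            'SELECT id, payload FROM records WHERE id BETWEEN ? AND ?'
        );
        $stmt->bind_param('ii', $this->startId, $this->endId);
        $stmt->execute();
        $result = $stmt->get_result();
        while ($row = $result->fetch_assoc()) {
            processRecord($row); // placeholder per-record work
        }
        $mysqli->close();
    }
}

$workers = [];
foreach ([[1, 250000], [250001, 500000]] as $range) {
    $worker = new RangeWorker($range[0], $range[1]);
    $worker->start();
    $workers[] = $worker;
}
foreach ($workers as $worker) {
    $worker->join(); // wait for all ranges to finish
}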

Thanks in advance for your help!
