Git Workflow for Large, Live Site?
The WordPress site I am working on has thousands of posts, is updated hourly, and relies on dozens of plugins and hacks to work the way it does.
Can someone explain a git workflow that would allow the following?
Essentially, I want to take everything that is currently there (all WordPress files, posts, plugins, etc.) and work on it locally. This would include, I presume, copying the database and running it locally. (I am aware of the Mark Jaquith solutions, but obviously can't start afresh with his WordPress Skeleton.)
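For concreteness, this is roughly the kind of thing I had in mind for the initial copy. It is only a sketch: the hostnames, usernames, database names, and paths are all placeholders, and it assumes the default `wp_` table prefix.

```bash
# On the live server: dump the production database.
mysqldump -u wp_user -p --single-transaction live_wp_db > live_wp_db.sql

# Pull the dump and the full WordPress tree down to the local machine.
scp user@live.example.com:~/live_wp_db.sql .
rsync -avz user@live.example.com:/var/www/site/ ./site/

# Locally: create a database and load the dump into it.
mysql -u root -p -e "CREATE DATABASE local_wp_db"
mysql -u root -p local_wp_db < live_wp_db.sql

# WordPress stores its URL in the database, so point it at the local host
# (assumes the default wp_ table prefix).
mysql -u root -p local_wp_db \
  -e "UPDATE wp_options SET option_value='http://localhost' WHERE option_name IN ('siteurl','home');"
```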
There are specific reasons why I have to work from the same database and post base; it is unfortunate, but necessary.
I then want to make changes, test them in a local environment, and push them to the live server, along with the database I have been using locally. I will have a team that can keep the development database up to date with the content being created on the live database (as I don't think there is any other way, right?). I presume I will also have to manually keep the media uploaded to the live site in sync with the media folder in my local setup.
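For the media sync, I was picturing a periodic one-way pull along these lines (again just a sketch; the host and paths are placeholders):

```bash
# Pull newly uploaded media from the live server into the local uploads folder.
# Could be run by hand or from cron.
rsync -avz user@live.example.com:/var/www/site/wp-content/uploads/ \
      ./site/wp-content/uploads/
```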
As such, there will be a point where I want to push what I have been working on locally to the live server. I presume everything in my git repository that differs should simply overwrite what is on the live server, and that I should upload the development/local database contents to my AWS account and then point the live wp-config.php at that database.
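The deployment step I have in mind is the usual bare-repo-plus-hook pattern, sketched below. The repo path, remote name, branch, and AWS endpoint are all placeholder assumptions:

```bash
# One-time setup on the live server: a bare repo whose post-receive hook
# checks the pushed branch out into the live web root.
git init --bare /var/repos/site.git
cat > /var/repos/site.git/hooks/post-receive <<'EOF'
#!/bin/sh
# Overwrite tracked files in the web root with the newly pushed master.
GIT_WORK_TREE=/var/www/site git checkout -f master
EOF
chmod +x /var/repos/site.git/hooks/post-receive

# Locally: add the live server as a remote and push when ready to deploy.
git remote add live ssh://user@live.example.com/var/repos/site.git
git push live master

# Load the local database into the AWS instance (endpoint is a placeholder),
# then edit DB_HOST and DB_NAME in the live wp-config.php to point at it.
mysqldump -u root -p local_wp_db > local_wp_db.sql
mysql -u wp_user -p -h mydb.example.us-east-1.rds.amazonaws.com live_wp_db < local_wp_db.sql
```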
Am I thinking this through in the right way? How do large sites like the NY Times handle conflicting databases in their content management systems, and other such issues, when pushing to the live server?
Please also note that we are running WordPress 3.5, and there is no possibility of upgrading because of the specifics of the situation.