What is the best way to automatically delete table rows on Heroku (free plan)?

I am using the free plan for a personal project, so I can't afford to move to a better (paid) one. Unfortunately, my database filled up, and stupid me did a reset, thinking it would delete only the rows and leave the structure intact. But it deleted everything, so now I have to rebuild it.
Before I start, I'd like to know if there is a way to purge old records/rows from the database. If I could manage to do this automatically (ideally via some Heroku setting I've missed), so that I never hit the 10,000-row limit, that would be great.

There are no safeguards built into the platform to delete rows on your behalf. You will need to do it manually or teach one of your applications to do it programmatically.
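For example, if your app has a Python layer, a cleanup job along these lines could be run periodically (the free Heroku Scheduler add-on works for this). This is just a minimal sketch: the events table, the created_at column, and the row budget are hypothetical placeholders to adapt to your own schema.

```python
# cleanup.py -- minimal sketch of a scheduled cleanup job.
# Assumes a hypothetical "events" table with a "created_at" timestamp;
# adjust the table, column, and row budget to your schema.
import os

import psycopg2

ROW_BUDGET = 9000  # stay safely under the 10,000-row plan limit


def trim_old_rows():
    # Heroku exposes the Postgres connection string as DATABASE_URL.
    conn = psycopg2.connect(os.environ["DATABASE_URL"])
    try:
        with conn, conn.cursor() as cur:
            cur.execute("SELECT count(*) FROM events")
            (count,) = cur.fetchone()
            excess = count - ROW_BUDGET
            if excess > 0:
                # Delete the oldest rows first.
                cur.execute(
                    """
                    DELETE FROM events
                    WHERE id IN (
                        SELECT id FROM events
                        ORDER BY created_at ASC
                        LIMIT %s
                    )
                    """,
                    (excess,),
                )
    finally:
        conn.close()


if __name__ == "__main__":
    trim_old_rows()
```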

Related

Working with TFS offline and then going back online

When the TFS server is down and developers work offline, will it be business as usual once the server comes back up, even though they have made local changes?
Will those just be pending changes at that point?
When you go offline you can carry on working on your files. When you go back online it will check to see if anyone has checked in/out the files that you have been working on.
If none of them have changed, it will check them out for you and you carry on as normal.
If someone has checked in/out, then you will have to attempt to merge the files, either at that point or on check-in. Sometimes this will auto-merge; other times you'll have to do it yourself.
How big a problem this is depends on how many overlapping files are going to be edited. Merging can be a pain, so I'd try to keep the server's offline time as short as possible. If you're using them, avoid changing auto-generated content, such as a Linq2Sql .dbml file, as these can be a nightmare to merge.

Star Team to TFS 2010 Migration with history

I want to migrate from StarTeam 2005 to TFS 2010 with HISTORY. Is there any tool, or any way, I can do it cost-effectively? I know about the Timely Migration tool, but it is too expensive.
There is no tool out there to do this. You are stuck with paying for Timely Migration or writing it yourself. Capturing history from StarTeam is extremely complex, because of what the view looks like historically. You can roll back the view to a point in time, and this alone works very well, but rolling back to every point in time where a change happened to the view is practically impossible to do using the API:

1) Not everything has an audit record, so you can't rely on the audits.
2) The audit records get purged.
3) There is a special feature to "play back" the history of the view to generate the listener events (it requires MPX), but this will miss many events.
4) When items are shared, configured, branched, etc., these do not generate any audits in the project.
5) Even if they did, getting every single change would require iterating through the view history down to the second and analyzing the differences between configurations.

So that means that if your project has been active for a month, and each time you analyze and diff two view configurations it takes 5 seconds, then actually migrating your project would take 5 months, and in the meantime it would be locked down.
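For what it's worth, the 5-month figure checks out. A quick back-of-the-envelope calculation, using the same numbers as above (one view configuration per second of history, ~5 seconds per diff):

```python
# One diff per second of project history, at ~5 seconds per diff.
seconds_of_history = 30 * 24 * 60 * 60   # one month of activity: 2,592,000 s
total_seconds = seconds_of_history * 5   # 12,960,000 seconds of diffing
print(total_seconds / (24 * 60 * 60))    # -> 150.0 days, i.e. roughly 5 months
```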
So the next way to do this would be to establish "baselines" to compare. Using build labels is a good starting point if you have nightly or continuous builds in your project, or even just certain builds that were QA certified. This way you can use these baselines as points for diff/compare and then bring in the history that way. While this isn't as granular as a full history, it is by definition getting the most important differences to migrate over.
However, keep in mind that even doing it this way does not maintain the links between branch/merge points across the different branches/views. The only way to do this would be to go directly into the StarTeam database to get this information.
I went through all of these steps trying to write my own suite of tools to migrate from StarTeam to Subversion. It was fun and interesting, imperfect but with some promise; ultimately, though, I never finished it, partly because the time involved would have been far more than the value I would have gotten out of it.
Which brings you, inevitably, to the most important question: what is the business value of maintaining full history? After going through this many times with project teams as a StarTeam administrator, more than 90% of the time it was readily apparent that the better approach is a cut-over. Pick a time when you can begin working on new work in the new system and freeze work in the old system. It usually can be done with very little downtime for the project team.

You can even start by bringing over a history of production releases to create a rough timeline in the new system. Use your existing comparison tools, in TFS or BeyondCompare or elsewhere, to reproduce each state of your project source code, docs, etc., reconcile it with your TFS project by checking in or deleting files as needed, and label your TFS project for each build you bring over. Line up all your TFS builds, work items, users, roles, etc., and make sure everything is ready. Then, at cut-over time, take the latest development snapshot from StarTeam over and do one more update to your TFS project. Lock your StarTeam users out of the project (for check-ins anyway) and begin working in TFS. Your TFS project will have a rough history of the most significant baselines, and you will be able to keep your StarTeam repository open to users in case more history is needed.
One other thing to consider is how to create a permanent archive of your project. If your repository is small enough this is doable, but it gets more time-intensive the larger your project is. First, copy your entire database and vault to a separate instance and get that copy up and running. Then delete every project EXCEPT the ones you want to archive. Run an online purge and make sure it runs to completion; you may need to restart your server and purge several times. When you are done, the copied repository should contain only the files and database records needed by your project, and at that point you can back up its database and vault and keep them indefinitely. Archiving this way also lets you remove the project from your existing StarTeam repository, reducing its size.
Haven't used StarTeam in over 3 years but that was a fun ride back in time. Hope you found it useful.

Upgrade Magento while keeping all order data, sales data, etc.

Is there a definitive way to upgrade Magento while keeping all order data, sales data, etc.?
After setting up a staging environment and ensuring all extensions, etc. are working properly, there will inevitably have been orders placed, orders processed, etc. in the meantime. Is there a proper way to sync that data with the production installation?
An alternative way I've read about is to set up a staging area and essentially practice the upgrade while taking notes on all the fixes that need to be completed, then put the site into maintenance mode and do the upgrade in the production environment. Is this a viable solution?
Any insight is appreciated.
Dane
With most Magento updates, you just need to ensure that you are pointing to the same database in your local.xml file. Magento will automatically run all the updates needed to bring that database up to speed. Yay!
However. Back up that database before the update runs. If you have any issues, you want to make sure you have a clean version.
The update often uses a lot of memory if you have many items/orders. So make sure your .htaccess file (which might have been overwritten by the update) still has a good amount of memory allocated.
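If you'd rather script that backup than remember it, here is a minimal sketch, assuming a MySQL database and mysqldump on the PATH. The database name and user are placeholders for whatever your app/etc/local.xml actually specifies.

```python
# backup_db.py -- dump the Magento database to a timestamped file
# before running the upgrade. The credentials and database name are
# placeholders; read them from your app/etc/local.xml in practice.
import subprocess
from datetime import datetime

DB_NAME = "magento"       # placeholder
DB_USER = "magento_user"  # placeholder

outfile = f"{DB_NAME}-{datetime.now():%Y%m%d-%H%M%S}.sql"
with open(outfile, "wb") as f:
    # -p with no value makes mysqldump prompt for the password.
    subprocess.run(
        ["mysqldump", "-u", DB_USER, "-p", "--single-transaction", DB_NAME],
        stdout=f,
        check=True,
    )
print(f"Backup written to {outfile}")
```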

Syncing Joomla between Dev and Prod servers?

I'm curious how other people have approached this. Our group has been given the directive of implementing an internal website utilizing Joomla. We've set up a dev server for the person who is responsible for maintaining the site, and a production server. We're using IIS and the current version of Joomla.
I can sync the two with Akeeba Backup Core and Kickstart, but that seems an all-or-nothing choice. It works, but if she's working on, say, the look and feel of the site and just wants to sync content, that doesn't appear to be doable.
I feel that someone out there must have tackled this before, but web searches turn up people running dev/prod on the same server in different subdirectories, or ignoring the all-or-nothing aspect of the issue entirely and going for the "do it all at once" approach, which doesn't seem practical. Content changes frequently, but the look and feel does not.
We've been doing this for several years now. We use a dev server and a prod server. When we make content changes on dev, we use phpMyAdmin to copy the content table from the dev db to the prod db. In some respects it's still an all-or-nothing approach, because we have to copy the entire content table at once; this means you can't have some pages still in development when you do the copy. In other respects it's still a piecemeal approach, because we can copy individual tables such as modules, menus, etc. But again, it's ALL modules at once, ALL menus at once, etc. There is a way in phpMyAdmin to copy an individual page or item from a table in dev and put it in the corresponding table in prod, but it's a little cumbersome. It works, though.
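The phpMyAdmin copy can also be scripted. A minimal sketch, assuming the MySQL command-line tools are installed and Joomla's default jos_ table prefix is in use; hosts, credentials, and database names are placeholders:

```python
# copy_table.py -- copy the Joomla content table from dev to prod,
# the scripted equivalent of the phpMyAdmin copy described above.
# Hosts, credentials, and the "jos_" prefix are placeholders.
import subprocess

TABLE = "jos_content"  # Joomla's default content table

# Dump the single table from the dev database...
dump = subprocess.run(
    ["mysqldump", "-h", "dev.example.com", "-u", "joomla", "-psecret",
     "joomla_db", TABLE],
    capture_output=True,
    check=True,
)

# ...and load it into prod. mysqldump emits DROP/CREATE TABLE
# statements by default, so this replaces the whole table at once,
# which is exactly the all-or-nothing behaviour described above.
subprocess.run(
    ["mysql", "-h", "prod.example.com", "-u", "joomla", "-psecret",
     "joomla_db"],
    input=dump.stdout,
    check=True,
)
```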
As for design elements (images, CSS, template changes, etc.), we do the same thing, but the copying is done manually by FTP from one server to the other. Obviously the same method applies to things like PDF files on dev that need to be moved to prod.
In summary, this method has worked fairly well for us for a long time, but its limitation is that you must realize you're copying an entire table at once.
The positive of all of this is that when we have pages that are in development, I have leverage over the content people to hurry and finish their work because one unfinished page can hold up the entire site!
This workflow dilemma has come up a few times for me.
You mention changes to look and feel; if it is just template changes, that is really the simpler case. It is quite simple to pull down an Akeeba Backup of the live server, kickstart it onto a local server, work on the template files, and then upload the updated template files to the live server.
That said, if it is more than CSS and HTML tweaks to existing files, it can be a more involved process.
Personally I've not found a silver bullet for this sort of thing, but with some forethought and planning it is not too bad.

Magento: multiple store launch one by one

I have developed 9 websites/stores in one Magento installation. On the DEV server it is ready and working well. Now my client wants to launch the first 3 websites next month; after that we will make some modifications to the other websites based on feedback from the first 3, and then launch the remaining websites one by one.
One thing I am worried about is how we make modifications and do further development. Because the Magento files and db are one installation, launching the first three means launching the whole system, and doing DEV work on the live site is not good, because if anything breaks, the LIVE sites will go down.
What is the best and most logical procedure in Magento for launching multiple stores one by one? What approach can we follow in such a situation?
Please help, thanks!
To my eyes, the fact that you have 9 different websites/stores running off the installation is of minor importance. You will encounter the same issues that any dev/live Magento set-up will encounter. At the point the site launches, you will need to create a second copy of your database/code for use as a development environment. With regards to the code, I would hope that you are using some kind of VCS such as Git or SVN; if you aren't, you should seriously consider it.
The database is the slightly trickier side of things. It is also the issue that is exacerbated by your having 9 different websites, since you will have a lot of different configurations. There will likely be 3 different types of configuration changes to be made:
1.) A setting that needs changing for the live websites.
2.) A setting that needs changing for a future website to be launched.
3.) A setting that needs changing in order to make your development site work.
The 3rd type is the easiest to deal with. You can simply change them in the database and forget about them. These will include things like setting the base_url values in core_config_data.
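As a concrete illustration of that 3rd type, here is a minimal sketch that rewrites the base URLs after copying the live database down to dev. It assumes a Python environment with the pymysql driver; the connection details and dev hostname are placeholders, while web/unsecure/base_url and web/secure/base_url are Magento's standard config paths for the base URLs.

```python
# set_dev_urls.py -- point a freshly copied dev database at the dev
# hostname by rewriting the base URLs in core_config_data.
# Connection details and the dev hostname are placeholders.
import pymysql  # assumed MySQL driver; any DB-API driver would do

conn = pymysql.connect(host="localhost", user="magento",
                       password="secret", database="magento_dev")
try:
    with conn.cursor() as cur:
        # This updates the base URL rows across every scope at once,
        # which is usually what you want on a throwaway dev copy.
        cur.execute(
            """
            UPDATE core_config_data
            SET value = 'http://dev.example.com/'
            WHERE path IN ('web/unsecure/base_url', 'web/secure/base_url')
            """
        )
    conn.commit()
finally:
    conn.close()
```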
The 2nd type should ideally be made with migrations rather than through the UI. If you are using source control, these migrations would be kept in a branch that gets merged into your master branch at the point you wish to launch the website they affect (at the point the code is merged, you may have to do some tinkering with the version numbers, based on how you deal with type 1).
The 1st type can be handled in one of two ways. A migration is the favourable option, as it means all installs of your code (dev/staging/live) can be kept in sync. If need be, simply ensuring you update your dev database at the same time as the live one would suffice.
Some of the things you need to change won't necessarily be the easiest things to achieve through migrations, but doing so should prevent any errors from arising whereby you forget to update a single value on one of your servers.
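In Magento itself a migration would normally live in a module's setup script, but to make the idea concrete, here is a sketch of what such a migration boils down to at the database level: an idempotent write to core_config_data that can be run against dev, staging, and live alike. The driver, connection details, and the config path/value used here are placeholder assumptions.

```python
# migrate_config.py -- the shape of a type-1/type-2 config migration:
# an idempotent write that can run against dev, staging, and live alike.
# In Magento this would live in a module's setup script; the config
# path and value below are placeholders.
import pymysql  # assumed MySQL driver


def save_config(conn, path, value, scope="default", scope_id=0):
    """Insert or update one core_config_data row. Because the table has
    a unique key on (scope, scope_id, path), re-running the migration on
    any environment is safe."""
    with conn.cursor() as cur:
        cur.execute(
            """
            INSERT INTO core_config_data (scope, scope_id, path, value)
            VALUES (%s, %s, %s, %s)
            ON DUPLICATE KEY UPDATE value = VALUES(value)
            """,
            (scope, scope_id, path, value),
        )


conn = pymysql.connect(host="localhost", user="magento",
                       password="secret", database="magento")
try:
    save_config(conn, "general/store_information/name", "Store Three")
    conn.commit()
finally:
    conn.close()
```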
