I noticed that in the Magento Certified Developer Study Guide, under the Database section, one of the items mentioned is "Write downgrade (rollback) scripts".
I've done some searching to see whether downgrade scripts are supported, and it seems they are not. I found this thread from earlier this year which concluded that downgrade scripts weren't supported at that time.
I also did some searching on Google and found this article discussing what appears to be some initial support for rollback scripts in the core.
I also searched under app/code/core/Mage for "rollback" and "downgrade", and most of what I found was code related to DB transaction rollbacks.
Why would the study guide be talking about this if it's not supported? I must be missing something.
Current versions of Magento have no implementation for rollback database migration scripts, where rollback means identifying that a module version number has decreased and running an appropriate script.
Remember, though, that you're looking at a study guide, not a manual.
While there's no support for formal rollbacks in the current version of Magento, as a Magento developer you may need to roll back database changes made in a previous module upgrade. I'd be ready for questions that describe that scenario, with answers that test your knowledge of existing Magento functionality.
It's here:
Mage_Core_Model_Resource_Setup::applyUpdates(), available at least since Magento 1.3.
I recently saw that mgo was no longer going to be maintained, and I have a recent project that uses mgo. My question is: is that a problem? Are there any risks?
Basically you may continue to use it, but since it's not maintained anymore, that means bugs discovered in it will not be fixed, and new features of MongoDB servers will not get added to it.
If you read the README of the github project (https://github.com/go-mgo/mgo), it lists your options.
The first suggests using the community-supported fork: github.com/globalsign/mgo. This is maintained, support for new features is being added, and it has the same API as the original package.
Since globalsign/mgo has an identical API, there is no reason not to switch to it. Most likely, all it will take is changing your imports, as in the sketch below.
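To illustrate, here is a minimal sketch of the switch, assuming a project that previously imported gopkg.in/mgo.v2; the connection string, the database/collection names, and the Person type are made up for the example:

// Minimal sketch: only the import path changes when moving from the
// unmaintained mgo package to the globalsign fork; the API calls stay the same.
package main

import (
	"log"

	"github.com/globalsign/mgo" // was: "gopkg.in/mgo.v2"
	"github.com/globalsign/mgo/bson"
)

// Person is an example document type for this sketch.
type Person struct {
	Name  string `bson:"name"`
	Phone string `bson:"phone"`
}

func main() {
	// Dial, sessions, collections, and queries: all identical to the original mgo API.
	session, err := mgo.Dial("localhost:27017")
	if err != nil {
		log.Fatal(err)
	}
	defer session.Close()

	people := session.DB("test").C("people")
	if err := people.Insert(&Person{Name: "Ale", Phone: "+55 53 8116 9639"}); err != nil {
		log.Fatal(err)
	}

	var result Person
	if err := people.Find(bson.M{"name": "Ale"}).One(&result); err != nil {
		log.Fatal(err)
	}
	log.Println("Phone:", result.Phone)
}

In an existing code base, the same effect is usually achieved by replacing the gopkg.in/mgo.v2 import paths with github.com/globalsign/mgo throughout and recompiling; the rest of the code should not need to change.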
Also note that there is an official MongoDB Go driver under development; it was announced here: Considering the Community Effects of Introducing an Official MongoDB Go Driver. Its project and source code are available here: github.com/mongodb/mongo-go-driver. It's currently in the alpha phase, so it's nowhere near production-ready (and they don't even have an estimated date for when it will be ready). If you need a driver now, globalsign/mgo is the best option at the moment.
Do note that both the official driver and globalsign/mgo are getting support for the newest features and additions of the MongoDB server; as an example, both support change streams (which weren't in the original mgo driver). For details, see: Watch for MongoDB Change Streams. A short sketch is also shown below.
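For completeness, a rough sketch of consuming a change stream with globalsign/mgo follows; it assumes the Watch / Next / Err / Close API added by the fork, an empty aggregation pipeline, default ChangeStreamOptions, and a replica-set deployment (change streams don't work against a standalone server):

// Rough sketch: watching all changes on a collection with globalsign/mgo.
// The database and collection names are made up for the example.
package main

import (
	"log"

	"github.com/globalsign/mgo"
	"github.com/globalsign/mgo/bson"
)

func main() {
	session, err := mgo.Dial("localhost:27017")
	if err != nil {
		log.Fatal(err)
	}
	defer session.Close()

	coll := session.DB("test").C("people")

	// An empty pipeline watches every change on the collection.
	changeStream, err := coll.Watch([]bson.M{}, mgo.ChangeStreamOptions{})
	if err != nil {
		log.Fatal(err)
	}
	defer changeStream.Close()

	// Iterate change events as they arrive.
	var changeDoc bson.M
	for changeStream.Next(&changeDoc) {
		log.Printf("change event: %+v", changeDoc)
	}
	if err := changeStream.Err(); err != nil {
		log.Println("change stream error:", err)
	}
}

The linked answer ("Watch for MongoDB Change Streams") covers the available options and error handling in more detail.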
There will be problems if:
You want some new MongoDB feature that the current mgo library doesn't support.
There is a bug or security issue in the mgo library.
That's one of the reasons why I'm not using mgo.
There will be an Official MongoDB Go Driver.
GitHub: mongo-go-driver
Forum: mongodb-go-driver
Considering the Community Effects of Introducing an Official MongoDB Go Driver
I know that the 5.0 release notes say "After the migration, source syntax-highlighting won't be available on a project until it has been successfully analyzed".
BUT I can't imagine that there is no way to activate it other than by running another analysis. In fact, when you have thousands of components (as is our case), you can't plan 4,500 analyses just to "restore" a basic but helpful feature! And it's even more true when you know that the majority of these components haven't changed in a long time... :(
So, please, tell me that we can write a little batch or program that will do the job without needing to pull all the sources! I don't know how, because I don't understand this limitation of the upgrade (why the sources aren't accessible).
You should trust the release notes. The information required for syntax highlighting is computed during analysis. Note that this also requires the language plugins to support the feature; I suggest upgrading them to the latest versions.
I wanted to try STXXL to find out how efficient it is at reading a big data file from disk.
So I set up the environment for using it.
Then I ran this program http://algo2.iti.kit.edu/dementiev/stxxl/tags/1.2.1/algo_2sort__file_8cpp-example.html in VS2010. However, the file data was not mapped to the vector_type; in fact, the contents of the file were deleted after this statement: vector_type v(&f);
I tried changing stxxl::file::RDWR to stxxl::file::RDONLY. This time the file content was not deleted, but the vector_type variable was still empty. I would appreciate your support to proceed further.
Also, is STXXL used widely in commercial applications?
Best Regards,
Ramki.
You are running a code example from STXXL version 1.2.1; which version have you installed on your system?
The most up-to-date version is "Development 1.4", which comes with many improvements and comprehensive documentation with a lot of short code examples, and it runs pretty well - check the official STXXL website under "Downloads and Documentation". Using version 1.4 is highly recommended.
Please check whether your problem still exists with the new "Development 1.4" version. The installation process has become much easier - read the Installation and Configuration part of the documentation first.
The official webpage provides a (certainly incomplete) list of publications and of ongoing and completed projects using STXXL successfully - there is no reason not to use it in a commercial environment.
I am not even sure how to ask this question. I am absolutely willing to research this myself, but I don't even know what exactly my options are.
I'm fairly new to programming in general, and I'm the sole developer on an ASP.NET MVC3 web application. We're about to upgrade to a new version which has a lot of additions to the data model. There are a couple of new entities, and some of the old entities have new properties/columns.
We've finished beta testing, and now we're going to try to get everyone moved over to the new version running in parallel with the current version; that way, if there are show-stopping problems, users can easily switch back to the old version. The problem is that we can't hook both up to the same DB because of the data model differences.
Can I make the old version use the new version's schema or something? I'm not really sure what my options are. I'm not asking you to write this for me; I'm just looking for some direction. Thanks!
You should be able to disable the metadata checks and then use the two versions against the same DB, assuming the models use a schema that is compatible with both.
http://revweblog.wordpress.com/2011/05/16/ef-4-1-code-first-disable-checking-for-edmmetadata-table/
Another option is to use Entity Framework 4.3 Code First Migrations and actually use an upgrade script that it will generate for you. If it fails, you can roll the script back to a prior version and use your prior code base. This would imply upgrading to 4.3 first before doing anything else, although you could still disable the metadata checks.
I'm using Magento 1.4.1.1 for my webstore. The payment processor supports only 1.4.0.0. I realized this only just now, as I was dreaming of opening the store. Duh! Poor planning.
What's the way out?
Will downgrading help? What are the implications of that?
Thanks for any and all inputs.
I am not aware of anyone ever having successfully downgraded Magento. That said, a few considerations:
Are you using version control like you should be? If so, you should have a copy of the site and database from just before the upgrade. You should be able to use this as a starting point. This is your most optimistic route by far.
If you have no version control, you can download both versions and use diff to get the changes. Applying this in reverse theoretically creates a backwards patch.
If you've stayed out of the core code entirely, the code change could be nearly as simple as replacing app/code/core.
Even if you do downgrade the code, the data structures have probably changed between versions, so you'll need someone experienced to find those changes and tell you how to back-patch your database. This is, to say the least, perilous.
Overall, I wouldn't want to undertake this task. As Anton said in the comments, you'll probably have an easier time getting integration done than reverting the changes.
Best of luck!
Thanks,
Joe