I've been all around the net trying to figure out which is the best currently available SubSonic 3 provider for Oracle.
Any ideas?
As far as I know, my fork on GitHub is still the "best" (or most complete) for use with Oracle ODP.NET.
https://github.com/rally25rs/SubSonic-3.0
https://github.com/rally25rs/SubSonic-3.0-Templates
(specifically OracleDataProvider.ttinclude)
The stuff in my fork was slated to go into the next major release of SubSonic, but the entire project sort of lost steam, and development has pretty much stopped at this point, so a pull request was never made to get the Oracle and IBM DB2 support merged back into the "main" SubSonic project.
However, I used the stuff in my fork for over a year in a production application before we moved off Oracle, so it definitely works.
It has been a while since I pulled any changes from the main SubSonic repository, so there might be bug fixes or other template changes in the main repositories that aren't in my fork.
These are more high-level questions, but our team is new to developing with APEX. We are currently a team of 3. We are using APEX 19.2, but are planning to upgrade to 21.1.
How do others handle the development flow, versioning, and releases of their projects when working in APEX?
We put the majority of the business logic (validations, source SQL, and process handling) in packages. So I feel it is pretty easy to version those files, as they are outside of the APEX UI and can be versioned in Git accordingly, but how do others version control all the APEX UI changes (pages, regions, items, DAs, etc.)?
I've searched and haven't really stumbled upon best practices for how teams that use APEX conduct their development process. One thing I'm nervous about is branching changes inside the APEX UI. Sometimes we are given a requirement (say A) and are asked to hold that release but start working on requirement B. We may even release B before A gets final approval.
Are there any best practices to ensure that developers working in the same workspace do not collide with each other's work? Luckily most of our project tasks do not overlap, but I'm curious how others handle this.
Any links or tips to this would be appreciated as we are new to APEX and trying to work these things out up front.
I'm probably not the right person to answer as my Apex team consists of 1 (one) member - me.
However:
We put the majority of the business logic (validations, source SQL, and process handling) in packages.
Me too, I found that to be the best option. Keep as little code on pages themselves as possible.
As for team development, did you read Managing the Application Life Cycle with Team Development?
Tracking Features might be particularly interesting for you. For example, it contains
Approval status of the feature. Indicates if the feature is to be implemented and the current progress.
which sounds like what you asked for.
I hope that someone - who really works in a team - will see your question and answer; I'd be interested in reading about their experiences and suggestions as well.
When we export the application as a zip, we can see that there is a folder structure. In Git we follow the same directory structure, so it is easy to review and version.
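For illustration, a split export of an APEX application (for example via SQLcl's apex export with the -split option; the exact command and layout vary by version, so treat the names below as an assumption) yields a directory tree roughly like the following, which maps naturally onto a Git repository:

```
f100/                              -- application ID 100
├── application/
│   ├── create_application.sql     -- application-level attributes
│   ├── pages/
│   │   ├── page_00001.sql         -- one file per page: easy to diff and review
│   │   └── page_00002.sql
│   └── shared_components/         -- LOVs, templates, authorization schemes, ...
└── install.sql
```

With one file per page and per shared component, a normal Git diff shows exactly which UI artifacts changed between releases.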
Recently I saw that mgo is no longer going to be maintained, and I have a recent project that uses mgo. My question is: is there a problem with that? Are there any risks?
Basically you may continue to use it, but since it's not maintained anymore, that means bugs discovered in it will not be fixed, and new features of MongoDB servers will not get added to it.
If you read the README of the github project (https://github.com/go-mgo/mgo), it lists your options.
The first suggests using the community-supported fork: github.com/globalsign/mgo. This is maintained, support for new features is being added, and it has the same API as the original package.
Since globalsign/mgo has an identical API, there is no reason not to switch to it. Most likely all it will take is changing your imports.
Also note that there is an official MongoDB Go driver under development; it was announced here: Considering the Community Effects of Introducing an Official MongoDB Go Driver. Its project and source code are available here: github.com/mongodb/mongo-go-driver. It's currently in alpha phase, so it's nowhere near production ready (and there isn't even an estimated date for when it will be). If you need a driver now, globalsign/mgo is the best option at the moment.
Do note that both the official driver and globalsign/mgo are getting support for the newest features and additions of the MongoDB server; as an example, both support change streams (which weren't in the original mgo driver). For details, see: Watch for MongoDB Change Streams
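Since the fork keeps the same package API, the migration is typically just an import path change. A sketch of the change, assuming your code imported the original gopkg.in path:

```diff
 import (
-    mgo "gopkg.in/mgo.v2"
-    "gopkg.in/mgo.v2/bson"
+    mgo "github.com/globalsign/mgo"
+    "github.com/globalsign/mgo/bson"
 )
```

After updating the imports, existing calls such as mgo.Dial() and the Session/Collection methods should compile unchanged.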
There will be problems if:
You want to use new MongoDB features that the current mgo library doesn't support.
There is a bug or security issue in the mgo library.
That's one of the reasons why I'm not using mgo.
There will be an Official MongoDB Go Driver.
GitHub: mongo-go-driver
Forum: mongodb-go-driver
Considering the Community Effects of Introducing an Official MongoDB Go Driver
I have a working instance of Orbeon Forms 4.3, which I set up to connect to Oracle. After upgrading to Orbeon Forms 4.4, connections to Oracle fail with different exceptions depending on the case, for instance:
org.orbeon.oxf.xforms.submission.XFormsSubmissionException
org.orbeon.oxf.common.ValidationException
org.orbeon.oxf.webapp.HttpStatusCodeException
What could be causing this?
Checking the source of the errors
Those exceptions are most likely only the symptom of the real cause. To be sure, change your log4j.xml and properties-local.xml per the development configuration, reproduce the problem, and check your orbeon.log again. See whether the log now contains a java.lang.NoClassDefFoundError: oracle/xdb/XMLType exception.
Solution
If you see that NoClassDefFoundError, you need to add xdb.jar and xmlparserv2.jar to the directory where you currently have the Oracle driver. Those two files are part of the Oracle driver and come alongside the other "main jar", i.e. ojdbc6_g.jar. On Tomcat, you typically place those files in Tomcat's lib directory. Restart Tomcat, and your problem should be solved.
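On Tomcat, the fix can be as simple as copying the two jars next to the main driver jar. A sketch, where the source path is an assumption to adjust for your installation:

```shell
# xdb.jar and xmlparserv2.jar ship with the Oracle driver distribution;
# /path/to/oracle-driver is a placeholder for wherever you unpacked it.
cp /path/to/oracle-driver/xdb.jar "$CATALINA_HOME/lib/"
cp /path/to/oracle-driver/xmlparserv2.jar "$CATALINA_HOME/lib/"

# Restart Tomcat so the new jars are picked up by the class loader.
"$CATALINA_HOME/bin/shutdown.sh" && "$CATALINA_HOME/bin/startup.sh"
```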
Why this is happening
This is happening because Orbeon Forms 4.4 started sending XML "as XML" to Oracle, instead of "as CLOB". Because of this, the driver uses classes from xdb.jar and xmlparserv2.jar. In retrospect, this might not have been the best approach; it was done because it looked like the "right thing", but it isn't clear it has any performance or other benefit, and it makes installation and upgrades harder by requiring additional jars.
Update
This issue is fixed in Orbeon Forms 4.5.
I am not even sure how to ask this question. I am absolutely willing to research this myself, but I don't even know what exactly my options are.
I'm fairly new to programming in general, and I'm the sole developer on an ASP.NET MVC3 web application. We're about to upgrade to a new version which has a lot of additions to the data model. There are a couple of new entities, and some of the old entities have new properties/columns.
We've finished beta testing and now we're going to try to get everyone moved over to the new version running parallel to the current version, that way if there are show-stopping problems, users can easily switch back to the old version. The problem is that we can't hook both up to the same db because of the data model differences.
Can I make the old version use the new version's schema or something? I'm not really sure what my options are. I'm not asking you to write this for me; I'm just looking for some direction. Thanks!
You should be able to disable the metadata checks and then run both versions against the same DB, assuming the models use schemas that are compatible between both.
http://revweblog.wordpress.com/2011/05/16/ef-4-1-code-first-disable-checking-for-edmmetadata-table/
Another option is to use Entity Framework 4.3 code-first migrations and actually use an upgrade script that it will generate for you. If it fails, you can roll the script back to a prior version and use your prior code base. This would imply upgrading to 4.3 before doing anything else, though you could still just disable the metadata checks instead.
I noticed in the Magento Certified Developer Study Guide, under the Database section one of the items mentioned is "Write downgrade (rollback) scripts".
I've done some searching to see whether downgrade scripts are supported and it seems they are not. I found this thread from earlier this year in which it seems they concluded that downgrade scripts weren't supported at that time.
Also, did some searching on google and found this article discussing what appears to be some initial support for rollback scripts in the core.
I also searched under app/code/core/Mage for "rollback" and "downgrade" and pretty much most of what I found was code related to DB transaction rollbacks.
Why would the study guide be talking about this if it's not supported? I must be missing something.
Current versions of Magento have no implementation for rollback database migration scripts, where rollback means identifying that a module version number has decreased and running an appropriate script.
Remember though, you're looking at a study guide, not a manual.
While there's no support for formal rollbacks in the current version of Magento, as a Magento developer you may need to roll back database changes made in a previous module upgrade. I'd be ready for questions that describe that scenario, with answers that test your knowledge of existing Magento functionality.
It's here: Mage_Core_Model_Resource_Setup::applyUpdates(), available at least since Magento 1.3.