Magento Custom Module with EAV - Fails creating text table

I'm trying to create a custom module using Magento's EAV structure and have been using this tutorial as a guide: http://www.magentocommerce.com/knowledge-base/entry/magento-for-dev-part-7-advanced-orm-entity-attribute-value
Everything seems to be working except that my install script always fails when trying to create the _text table. It creates the first three (entityname_datetime, entityname_int and entityname_decimal) and then throws the exception Can't create table: entityname_text from the core resource setup model.
I've tried changing the table name, but it fails at the same point. Except for the model and entity names, my code exactly mirrors the tutorial's.
Any tips for debugging? Using Mage::log within the core resource setup model doesn't seem to work, though I can't fathom why.

This is a bug that was recently introduced into the Community Edition version of the product. It's been reported to the core team via their public-with-a-login bug tracker. For now I'd skip creating non-varchar text types and continue working your way through the tutorial. (Not ideal, but as the tutorial suggests, it's a rare use case that calls for a from-scratch EAV model.)

After changing the table name, have you tried removing your module's entry from the core_resource table and clearing the cache? By the way, what name are you using?
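For reference, resetting a module's setup script usually looks something like this. This is a sketch, not from the answer above: modulename_setup is a placeholder for whatever resource code your module declares in its config.xml, and the table names are the ones from the question.

```sql
-- Force the install script to run again on the next request
DELETE FROM core_resource WHERE code = 'modulename_setup';

-- Drop the partially created tables so the script starts clean
DROP TABLE IF EXISTS entityname_datetime, entityname_int, entityname_decimal;
```

Then clear var/cache (or flush the cache from the admin) before reloading the page.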

Solution by the Magento team, from the bug report (reference link):
class Namespace_Modulename_Model_Resource_Setup extends Mage_Eav_Model_Entity_Setup
{
    public function createEntityTables($baseTableName, array $options = array())
    {
        // ...

        /**
         * DDL operations cannot be executed within a transaction,
         * so these lines are useless
         */
        //$connection->beginTransaction();
        try {
            foreach ($tables as $tableName => $table) {
                $connection->createTable($table);
            }
            $connection->commit();
        } catch (Exception $e) {
            //$connection->rollBack();
            throw Mage::exception('Mage_Eav', Mage::helper('eav')->__('Can\'t create table: %s', $tableName));
        }
    }
}

Related

Assert model was not made searchable

I'm building a system to manage some articles for my company using Laravel and Laravel Scout with Algolia as the search backend.
One of the requirements states that whenever something in an article is changed, a backup is kept so we can prove that certain information was displayed at a specific time.
I've implemented that by cloning the existing article with all its relationships before updating it. Here is the method on the Article model:
public function clone(array $relations = null, array $except = null)
{
    if ($relations) {
        $this->load($relations);
    }

    $replica = $this->replicate($except);
    $replica->save();

    $syncRelations = collect($this->relations)->only($relations);
    foreach ($syncRelations as $relation => $models) {
        $replica->{$relation}()->sync($models);
    }

    return $replica;
}
The problem is the $replica->save() line. I need to save the model first so that it has an ID when syncing the relationships.
But: the only thing preventing Scout from indexing the model is the model having its archived_at field set to a non-null value. Since this is a clone of the original model, that field is null as expected, and it is only changed after the cloning procedure is done.
The problem: Scout syncs the cloned model to Algolia, so I have duplicates there. I know how to solve this: wrap the clone call in the withoutSyncingToSearch callback (https://laravel.com/docs/5.6/scout#pausing-indexing).
But since this is rather important and the bug is already out there, I want a unit test backing me up that the clone was indeed not synced to Algolia.
I have no idea how to test this, though, and searching for ways to test Scout only leads to answers telling me not to test Scout itself, but rather that my model can be indexed, etc.
The question: how do I write a unit test that proves the cloned model wasn't synced to Algolia?
At the moment I'm thinking about writing a custom Scout driver for testing, but that seems like total overkill for testing one single function.
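One lighter-weight approach than a full custom driver is to swap Scout's engine for a Mockery mock inside the test and assert that update() is never called on it. The following is only a sketch under assumptions, not a confirmed recipe: it assumes Laravel 5.6 with Scout and Mockery installed, a model factory for the Article model, and the clone() method from the question; the class and driver names are made up.

```php
use Laravel\Scout\EngineManager;
use Tests\TestCase;

class ArticleCloneTest extends TestCase
{
    public function test_clone_is_not_synced_to_algolia()
    {
        // A strict mock: Mockery fails the test if update() is ever called.
        $engine = \Mockery::mock(\Laravel\Scout\Engines\Engine::class);
        $engine->shouldNotReceive('update');

        // Register the mock as a Scout driver and switch to it.
        $this->app->make(EngineManager::class)->extend('mock', function () use ($engine) {
            return $engine;
        });
        config(['scout.driver' => 'mock']);

        // Create the original without indexing it, so only the clone
        // could possibly trigger a sync.
        $article = \App\Article::withoutSyncingToSearch(function () {
            return factory(\App\Article::class)->create();
        });

        $article->clone();

        // Verifies the shouldNotReceive() expectation.
        \Mockery::close();
    }
}
```

Without the withoutSyncingToSearch fix in clone(), the $replica->save() call would hit the mocked engine's update() and fail the test; with the fix, the test passes.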

Make the entire Symfony app read-only

I need to set up a live demo of a Symfony app.
How can I make everything read-only? The users should be able to try all the features but not make any persistent change visible to others.
I could remove the INSERT and UPDATE privileges from the MySQL user, but that would produce an ugly error 500 when they try to save something...
Quick and dirty way to make your entire app Read-Only.
AppBundle/EventSubscriber/EntitySubscriber.php
namespace AppBundle\EventSubscriber;

use Doctrine\Common\EventSubscriber;
use Doctrine\ORM\Event\PreFlushEventArgs;

class EntitySubscriber implements EventSubscriber
{
    public function getSubscribedEvents()
    {
        return ['preFlush'];
    }

    public function preFlush(PreFlushEventArgs $args)
    {
        $entityManager = $args->getEntityManager();
        $entityManager->clear();
    }
}
services.yml
app.entity_subscriber:
    class: AppBundle\EventSubscriber\EntitySubscriber
    tags:
        - { name: doctrine.event_subscriber, connection: default }
I suppose you've already worked something out. But if not:
Use a dummy database: copy it from the original DB, let them play, and drop it when you don't need it anymore.
If you have no access to database creation and dropping, you can still do the trick: just add temporary prefixes to the table names in your Doctrine entities. No need to rewrite the entire app, just a few lines. Run migrations to create the new tables, and drop them whenever you want later.
Use a virtual machine: take a snapshot before the show, and roll back to the saved snapshot after the show.
These are more or less easy ways and they are platform independent.
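For the table-prefix option above, the change per entity is a single mapping annotation. A minimal sketch, assuming annotation mapping and a hypothetical Article entity; the demo_ prefix is made up:

```php
use Doctrine\ORM\Mapping as ORM;

/**
 * @ORM\Entity
 * @ORM\Table(name="demo_article")  // temporary "demo_" prefix for the demo copy
 */
class Article
{
    // ... fields and associations unchanged
}
```

After the demo, revert the annotation and drop the demo_* tables.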
Changing this at the Symfony application level has one of two disadvantages: either nothing gets saved, so the demo doesn't work nicely enough to show to the customer, or you make far too many code changes and throw all that work away right after the show.
Maybe you can use the session for this, or Memcache, which you can implement in Symfony (some examples are available on the web). Hope this helps.

Laravel created_by/modified_by relations

I was trying to get this working as a typical belongsTo relation. However, it keeps saying the column is not set on the model, even though it is there in the actual database.
I have tried to look at the source code as well as try many approaches to bypass this issue, however nothing seems to do anything.
public function modifiedBy()
{
    return $this->belongsTo('\Modules\Users\Model\User', 'modified_by');
}

public function createdBy()
{
    return $this->belongsTo('\Modules\Users\Model\User', 'created_by');
}
This is the code inside the model. I use PSR-0 to define modules to better split up the logic (no issues with that), but with this it gives the error
Undefined property: \Modules\Module\Model\CurrentModel::$modified_by
This is coming from a seed to push some initial info into the database.
$user = Sentinel::findById(1);
$model = new CurrentModel;
$model->modifiedBy()->associate($user);
$model->save();
This is basically how it goes together. I have tried for some time to figure out what is wrong, but I'm drawing blanks. Any ideas?
Found a solution. It's a workaround rather than a fix, though; I'd consider it an issue with Laravel, so I may file a bug report (although this could be fixed in Laravel 5?).
Basically, with modified_by I need to define the column the relation uses rather than letting Laravel generate it automatically (in order to do this "cleanly"). However, the "bug" (I only call it that because it currently looks like an unintended problem) means you cannot define the column it will use; you have to let Laravel decide for you.
So I changed the functions to look like this:
public function modifiedby()
{
    return $this->belongsTo('\Modules\Users\Model\User');
}
This makes Laravel assume the column is modifiedby_id, and after changing my migrations to reflect that, the error was gone.

MVCScaffolding and Database Sync for constantly-changing model

I have used MVCScaffolding from Nuget Package Manager and followed the brief tutorial on how it works.
It seems simple enough, and when I run
Scaffold Controller Team -Repository -Force it creates all the repository-pattern code surrounding "Team".
However, in an attempt (and success) to break this, I decided to add an additional field to the "Team" class (myRandomField).
As I expected, when I compiled I got an error in the MVC View which was:
The model backing the 'MvcApplication1Context' context has changed since the database was created. Either manually delete/update the database, or call Database.SetInitializer with an IDatabaseInitializer instance. For example, the DropCreateDatabaseIfModelChanges strategy will automatically delete and recreate the database, and optionally seed it with new data.
Obviously this error occurs because I have updated the model (code-first) and the DB is now out of sync with it.
What is the best approach to get around this issue? Is there an easy way to sync the DB with the model? I plan on doing a lot of editing to my models, as the project I am starting will be rolled out gradually (so a complete database rebuild each time is out of the question). Is code-first the right approach for me in this case? I really like this plugin/tool and it would be a shame not to use it.
jad,
as mentioned in my comment above, if you're 'happy' to lose all existing data in your DB, you can add the following to your global.asax:
[Conditional("DEBUG")]
private static void InitializeDb()
{
    using (var db = new YourContext())
    {
        // double indemnity to ensure just SQL Server Express
        if (db.Database.Connection.DataSource != null
            && db.Database.Connection.DataSource.IndexOf("sqlexpress",
                StringComparison.InvariantCultureIgnoreCase) > -1)
        {
            // initializer code here
            Database.SetInitializer(new DropCreateDatabaseIfModelChanges<YourContext>());
        }
    }
}
and then call this from Application_Start(), i.e.
protected void Application_Start()
{
    InitializeDb();
    ViewEngines.Engines.Add(new MobileViewEngine());
    AreaRegistration.RegisterAllAreas();
    RegisterGlobalFilters(GlobalFilters.Filters);
    RegisterRoutes(RouteTable.Routes);
}
If you wish to retain the data, then you'll have to use a data migration tool. I used the Red Gate tools (SQL Comparison Bundle) to perform this. Basically, it looks at your new schema and your old schema and migrates the existing data into the new schema, ready for test and deployment, all without touching the original db file.
I think this should work well for you.
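For completeness, EF 4.3+ also ships Code First Migrations, which updates the schema in place without dropping data (the next question below covers how it tracks state). The basic Package Manager Console flow looks like this; the migration name is just an example:

```powershell
PM> Enable-Migrations
PM> Add-Migration AddMyRandomField
PM> Update-Database
```

Each Add-Migration scaffolds an Up/Down script you can review before Update-Database applies it.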

EF Code First Migrations - how does it remember the previous model change?

So I'm using Entity Framework Code First Migrations.
I made a change to my model and added a new manual migration, and it got the up script wrong.
So I deleted the migration, having decided at the same time not to change the model the way I'd thought. After deleting the migration class and resetting the model (i.e. setting it back as it was), I changed my model again.
When I generate a new migration, it acts as if it is migrating from the one that I deleted.
How does Entity Framework Code First know the last model state if you clean up and delete a migration?
And how do you reset this?
In your database, under "Tables / System Tables" (assuming you use SQL Management Studio), edit the __MigrationHistory table.
This stumped me too, after I had deleted all the migration *.cs files and VS still "knew" about the old migrations!
You probably didn't delete the Designer file underneath it that contains information about automatic migrations up until that point.
http://msdn.microsoft.com/en-US/data/jj554735
Run the Add-Migration AddBlogRating command...
The migration also has a code-behind file that captures some metadata. This metadata will allow Code First Migrations to replicate the automatic migrations we performed before this code-based migration. This is important if another developer wants to run our migrations or when it’s time to deploy our application.
The code-behind is a file like 201206292305502_AddBlogRating.Designer.cs, underneath the manual migration class you created. It looks like this:
public sealed partial class AddBlogRating : IMigrationMetadata
{
    string IMigrationMetadata.Id
    {
        get { return "201206292305502_AddBlogRating"; }
    }

    string IMigrationMetadata.Source
    {
        get { return "H4sIAAAAAAAEAOy9B2AcSZ...=="; }
    }

    string IMigrationMetadata.Target
    {
        get { return "H4sIAAAAAAAEAOy9B2AcSZ...=="; }
    }
}
Those two strings are base64-encoded dumps of your entire model prior to the migration and after it. The idea is that anything prior to the first manual migration logged was automatic, so when you apply all this to a fresh DB, Migrations can look at the history and say:
Manual1
Manual2
Check Source to determine the goal model before Manual1 and get there using the automatic approach; apply Manual1; check Source on Manual2 and use the automatic approach to get there; apply Manual2; finally use the automatic approach to get from there to the current compiled model state.
