Migrate AutoML translation's custom model to another GCP Project - google-cloud-automl

I have some custom models for Google AutoML Translations trained in a QA GCP project.
I want to promote the models to production, but I can't find a way to export or migrate them from the QA project to the PROD project.
Is it possible that the only solution is to retrain the models from scratch? I want to avoid this because the training sets are exactly the same in QA and PROD, and the training process is not cheap.
Thank you in advance to anyone who can help figure out a possible solution.

Related

Best way to open an internal project to the world as a SaaS

We have developed an internal CRM and have been using it for the last few months. We have now decided to open it to the public as a SaaS product, and I'm wondering what the best way is to upgrade the database structure, which is currently designed for a single company, so that it can handle multiple paying customers.
At the moment the planned solution is to add a "customer" field to every table in the database and update the backend logic to use this field.
Are there more elegant solutions to this problem?
The database is MySQL and the backend is built with Laravel.
CRM data can be very sensitive, and you need to be extremely careful not to "leak" data to the wrong customers.
For an existing app, I would argue for a system that creates a fresh DB for each customer.
You would have one codebase that connects to a customer-specific DB.
This way you don't need to change much in your current DB structure, but "just" implement the mechanism to use the correct DB according to the customer account.
This is how I would do it. In any case, this is a massive paradigm change from an internal app to a SaaS platform app, and you should identify the necessary steps to go through to achieve the desired result.
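
To make the "use the correct DB according to the customer account" mechanism more concrete, below is a minimal sketch of the idea. It is written in Python purely for brevity (the backend in question is Laravel, where the same thing is normally done in middleware or via a tenancy package); the registry, database names, and function names are illustrative assumptions, not part of the original answer.

# Minimal sketch of "one codebase, one database per customer".
# The registry below is a dict for illustration; in practice it would be a
# central table or config mapping the customer account to its own database.
import sqlite3  # stand-in driver; a Laravel/MySQL setup would use MySQL connections

TENANT_DATABASES = {
    "acme": "tenants/acme.db",
    "globex": "tenants/globex.db",
}

def connection_for(account: str):
    """Return a DB connection for the given customer account.

    Raises KeyError for unknown accounts, so a request can never silently
    fall through to another customer's data.
    """
    return sqlite3.connect(TENANT_DATABASES[account])

def list_contacts(account: str):
    # Every query runs against the customer's own database, so no
    # "WHERE customer_id = ?" filter is needed in application queries.
    with connection_for(account) as conn:
        return conn.execute("SELECT name, email FROM contacts").fetchall()

The point of the sketch is that tenant isolation lives at the connection level: pick the database once per request (based on the authenticated account or subdomain) and the rest of the code stays unchanged.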

Power Automate Flows

I have been learning to use flows in Power Automate and noticed that my flows are different from those in most videos I have tried to follow. For example, under my Flows option I have: Cloud flows, Desktop flows, Business process flows, and Shared with me, whereas the videos I have been watching show Cloud flows and Business process flows (same as I have) plus Team flows and UI flows, which I do not have. Would I be correct in thinking that the flows have just been renamed, and if so, could someone point out which ones changed to which?
Many thanks.
I have scoured the net but had no luck finding anything about a rename.
Regards
Ross
As you have mentioned, the terminology around the flow names has changed. I have been trying to find a proper document about the name changes but couldn't find one. To answer your question:
Team flows are now "Shared with me", and UI flows are now "Desktop flows".
Hope I have answered your question.
Regards,
RT

How to change the configuration to build as a single deployment with multiple databases?

I came across this boilerplate just a couple of weeks ago, and by coincidence I was working on designing a multi-tenant SaaS architecture using .NET Core, EF Core as the ORM, an SPA (Angular) as the presentation layer, and an OData API, so this boilerplate is exactly what I am looking for. I have one question: how do I set up the configuration on this sample Event SaaS app to make it a single deployment with multiple databases?
I have noted there is an appsettings.json where subdomains are stored, and each entity (for example, Event) inherits from IMustHaveTenant, which means each entity has a TenantId. That setup is suitable for a single deployment with a single database (the database filter is applied automatically by aspboilerplate), but I am looking to make it a single deployment with multiple databases (one database per tenant). It would be great if you could give me some clues.
Note: This is what I am talking about.
Thank you.
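
For readers less familiar with the distinction drawn above, here is a small framework-agnostic sketch (Python with invented names; it is not aspboilerplate/ABP code) contrasting the two modes: a shared database filtered by TenantId versus one database, and therefore one connection string, per tenant.

# Sketch of the two multi-tenancy modes discussed above. All names are
# illustrative; ABP has its own APIs and configuration for both modes.
from dataclasses import dataclass

@dataclass
class Event:
    tenant_id: int   # what IMustHaveTenant amounts to: each row carries its tenant
    title: str

# Mode 1: single deployment, single database.
# Every query is filtered by the current tenant's id (ABP applies this
# filter automatically for IMustHaveTenant entities).
def events_for_tenant_shared_db(all_events, current_tenant_id):
    return [e for e in all_events if e.tenant_id == current_tenant_id]

# Mode 2: single deployment, database per tenant.
# The tenant is resolved to its own connection string; no TenantId filter
# is needed because the whole database belongs to a single tenant.
TENANT_CONNECTION_STRINGS = {
    1: "Server=db1;Database=EventsTenant1;...",
    2: "Server=db2;Database=EventsTenant2;...",
}

def connection_string_for(current_tenant_id):
    return TENANT_CONNECTION_STRINGS[current_tenant_id]

In ABP the second mode is typically achieved by storing a connection string per tenant and letting the framework's connection string resolution pick it up, but the exact configuration should come from the framework's documentation rather than this sketch.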

Architectural advice for developing a Service Portal application

I am new to the ServiceNow platform, developing a custom app using the Service Portal, and I am looking for some architectural advice from experts.
My service is going to serve different companies according to their requirements, while keeping the codebase easy to maintain. For example, I have a base app with some concrete requirements that fit all companies, but there will also be company-specific features: feature A for company A, feature B for company B, and so on. My initial plan was classic software development: a single codebase in Git with multiple feature branches deploying to multiple instances. But there are situations where I might need to merge those branches, which is not possible now. My question is: is there an alternative way to do that? Another possible scenario: should I go with a single instance with ACL-based data separation? (That does not feel scalable to me, because the amount of data will be huge after some time.) Or is it possible to apply a regular SaaS architecture such as multi-tenancy (a single app with multiple databases) with some configuration-based feature separation?
Thanks in Advance.

Training Model for Each Individual in AzureML

I want to train an ANN model for each individual in Azure ML. For example, there is an application which needs to learn the behavior of each individual separately. How is this possible in azure-ml? Any suggestions?
As far as I know, I can create a model and train it with some data, but I don't know how I can train it specifically for each user. I should mention that I am looking for a scalable idea which is applicable to a real situation (possibly 100 thousand users).
I highly recommend the "Create many Machine Learning models and web service endpoints from one experiment using PowerShell" article on this topic. It uses Azure ML PowerShell to automate the creation of web services that have an identical structure but user-specific trained models. Your application would need to keep track of the correspondence between web services and users.
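
The "keep track of the correspondence" part is mostly bookkeeping in the calling application. Below is a minimal sketch of that bookkeeping; the endpoint URLs, keys, header names, and payload shape are placeholders, not the actual Azure ML request format.

# Sketch of per-user endpoint bookkeeping for the "one trained model per user"
# pattern. URLs, API keys and the payload layout are placeholders; consult the
# Azure ML documentation for the real request format of your web services.
import json
import urllib.request

# In production this mapping would live in a database, populated by the
# PowerShell script that creates one web service per user.
USER_ENDPOINTS = {
    "user-001": {"url": "https://example.com/score/user-001", "api_key": "KEY_001"},
    "user-002": {"url": "https://example.com/score/user-002", "api_key": "KEY_002"},
}

def score(user_id, features):
    """Send a scoring request to the web service trained for this specific user."""
    endpoint = USER_ENDPOINTS[user_id]  # KeyError means no model trained for this user yet
    request = urllib.request.Request(
        endpoint["url"],
        data=json.dumps(features).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + endpoint["api_key"],
        },
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())

With 100,000 users the mapping itself stays tiny (one row per user), so the scalability concern is mainly about automating the creation and retraining of the services, which is exactly what the PowerShell approach in the linked article addresses.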
