MLflow with Hugging Face - huggingface-transformers

I am trying to log models to MLflow, but I keep getting: NO ARTIFACT RECORDED.
I am using the callbacks = [MLflowCallback] argument of the Trainer, but during training only the metrics are saved, not the artifacts.
Has anyone managed to get artifacts logged as well when using transformers?
Thanks in advance :)
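
For reference, a minimal self-contained sketch of the setup described above, assuming a recent transformers version; the bert-base-uncased model, the toy dataset, and the output paths are placeholders. As observed in the question, the callback only logs parameters and metrics by default, so the sketch also logs the saved model directory as MLflow artifacts explicitly at the end (the HF_MLFLOW_LOG_ARTIFACTS environment variable mentioned in the comment is an assumption to verify against your transformers version):
import torch
import mlflow
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)
from transformers.integrations import MLflowCallback

# Toy stand-in data so the sketch runs end to end; replace with a real dataset.
class ToyDataset(torch.utils.data.Dataset):
    def __init__(self, tokenizer):
        self.enc = tokenizer(["good movie", "bad movie"], padding=True, truncation=True)
        self.labels = [1, 0]
    def __len__(self):
        return len(self.labels)
    def __getitem__(self, i):
        item = {k: torch.tensor(v[i]) for k, v in self.enc.items()}
        item["labels"] = torch.tensor(self.labels[i])
        return item

# Placeholder model; swap in whatever you are actually fine-tuning.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

# Alternative some versions honor: set HF_MLFLOW_LOG_ARTIFACTS=1 before training
# so the callback itself uploads artifacts (verify for your transformers version).
args = TrainingArguments(output_dir="out", num_train_epochs=1, per_device_train_batch_size=2)
trainer = Trainer(model=model, args=args, train_dataset=ToyDataset(tokenizer),
                  callbacks=[MLflowCallback])
trainer.train()  # the callback logs params/metrics to the active MLflow run

# Explicitly log the saved model files as artifacts (reuses the callback's run if still active).
trainer.save_model("out/final_model")
if mlflow.active_run() is None:
    mlflow.start_run()
mlflow.log_artifacts("out/final_model", artifact_path="model")
mlflow.end_run()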

Related

vertex-ai learning curve - batch predictions (label photos) without deployment to an endpoint?

-- resolved -- by using the CLI to UNDEPLOY the model, then following the docs at https://cloud.google.com/vertex-ai/docs/image-data/classification/get-predictions#curl to submit a batch prediction.
I'm a Vertex AI newbie just trying to learn the basics without a lot of cost, and I've been using the Vertex AI dashboard to good effect (training a model and endpoint to recognize/label photos using 3 categories from the training labels).
But the problem I experienced was that somehow, inadvertently through the dashboard, the model got deployed, which led to daily $30 charges to the project.
IMO, from the Vertex AI docs, you should be able to run batch predictions WITHOUT having to deploy the model/endpoint.
Is anyone able to verify the above?
With the current price lists, on a very small budget it's possible to touch most of the bases (create, train, test a model to label photos) as long as you DO NOT deploy it. And batch predictions appear to be allowed with NO deployment.
The only costly SKU I ran into was the actual deployment (see photo).
For the undeploy command details, you can visit the documentation here.
Sample command:
gcloud ai endpoints undeploy-model 123 --project=example --region=us-central1 --deployed-model-id=456
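
For anyone who prefers the Python client to the curl call in the linked docs, the batch prediction submission looks roughly like the sketch below; the project, region, model ID, and Cloud Storage paths are placeholders, and the parameters should be double-checked against the google-cloud-aiplatform version you have installed:
from google.cloud import aiplatform

# All names below (project, region, model ID, bucket paths) are placeholders.
aiplatform.init(project="example", location="us-central1")
model = aiplatform.Model("projects/example/locations/us-central1/models/123")

# Submit a batch prediction job directly against the model; no endpoint
# deployment is involved. By default this call blocks until the job finishes.
job = model.batch_predict(
    job_display_name="photo-labels-batch",
    gcs_source="gs://example-bucket/batch_request.jsonl",
    gcs_destination_prefix="gs://example-bucket/batch_output/",
)
print(job.state)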

Trying to upload the latest plugin for Google Cloud to Data Fusion but getting an error while uploading

This is a follow-up to an earlier post of mine: Possible to modify or delete rows from a table in BigQuery dataset with a Cloud Data Fusion pipeline? I am trying to follow the suggested answer there: compile the latest version of the Google Cloud Platform plugin and upload it to Data Fusion so I can use the latest features.
We downloaded the code, compiled it, and got two files:
google-cloud-0.13.0-SNAPSHOT.jar
google-cloud-0.13.0-SNAPSHOT.json
Inside the JSON file, the last lines for the parent artifacts were:
},
"parents": [
"system:cdap-data-pipeline[6.1.0-SNAPSHOT,7.0.0-SNAPSHOT)",
"system:cdap-data-streams[6.1.0-SNAPSHOT,7.0.0-SNAPSHOT)"
]
}
Initially I went to Data Fusion and chose to upload a new plugin, but I got an error about the parent artifacts not existing. So I did some digging and found that the version of the artifacts currently used in Data Fusion is 6.0.1.
So I modified the parent artifacts to the correct versions, and now the last few lines in the JSON file show:
},
"parents": [
"system:cdap-data-pipeline[6.0.1-SNAPSHOT,7.0.0-SNAPSHOT)",
"system:cdap-data-streams[6.0.1-SNAPSHOT,7.0.0-SNAPSHOT)"
]
}
When I try to upload the plugin again, it seems to pass the artifact check step, but it fails on some sort of class check and I see this on the upload screen:
Class could not be found while inspecting artifact for plugins. Please
check dependencies are available, and that the correct parent artifact
was specified. Error class: class java.lang.NoClassDefFoundError,
message: io/cdap/cdap/etl/api/validation/ValidationException.
So now I'm really lost about what's wrong here. I suspect that the artifact version being used in Data Fusion does not have the class that is throwing the error. If so, how do I update the artifact itself?
Or if there is something else that I am missing in this whole process, then I would really appreciate any guidance or support on this!
Regards
You can try using the release/0.12 branch of the google-cloud plugins repo. That is compatible with the 6.0 version of Cloud Data Fusion.

Sonar plugin PostJob measures

I have a Sonar plugin that posts a comment in GitLab when there are new Sonar issues in a commit. I would like to add the difference in code coverage and code duplication to the comment.
This is done with a Sonar plugin based on PostJob.
The issues are retrieved by injecting ProjectIssues into the constructor.
Is there a way to retrieve the Measures in the PostJob? I saw that this changed in Sonar 5.2, but there is no real explanation of how to read measures.
Thank you for any help.
If ProjectIssues or PostJobContext do not have the data you are looking for, you can use the web API (REST):
WsRequest wsRequest = new GetRequest("api/...");
But beware: the latest measures will not yet be computed at the moment the @BatchSide code runs; you have to wait for the Compute Engine to finish its work. Since you cannot wait on the @BatchSide, consider moving your plugin to the @ComputeEngineSide.
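
If it helps to see the shape of that web API call, here is a rough sketch that fetches coverage and duplication over HTTP with Python's requests (inside a plugin you would use the WsRequest/GetRequest client as above); the server URL, project key, token, and exact parameter names are assumptions to check against your SonarQube version:
import requests

SONAR_URL = "https://sonar.example.com"   # placeholder server URL
PROJECT_KEY = "my:project"                # placeholder project key

# Fetch the current coverage and duplication measures for the project.
resp = requests.get(
    SONAR_URL + "/api/measures/component",
    params={
        "componentKey": PROJECT_KEY,  # newer servers use "component" instead
        "metricKeys": "coverage,duplicated_lines_density",
    },
    auth=("my_user_token", ""),  # token-as-username basic auth, placeholder
)
resp.raise_for_status()
for m in resp.json()["component"]["measures"]:
    print(m["metric"], m["value"])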

Veins/OMNeT++ retrieve Traffic Light information

I am new to Veins and OMNeT++ and am working on a project that will use traffic lights to map out intersections. http://www.sumo.dlr.de/wiki/TraCI/Traffic_Lights_Value_Retrieval#Command_0xa2:_Get_Traffic_Lights_Variable shows that the ID list can be retrieved, but I am having trouble getting that information. I tried using the command trafficlight(1).idlists; and I included #include "veins/modules/mobility/traci/TraCICommandInterface.h". The error shows up as "trafficlight was not declared in this scope". I am trying to call this function in BaseWaveAppLayer.cc. Any help would be great. Thanks!
The latest version of Veins supports traffic light access and control.
The command you mention is implemented in SUMO, but not in Veins 4.4.
You will need to write your own method. For inspiration, you can refer to similar commands and how they are implemented.

The Entity you are trying to import is not the same as the one existing in the database even though it has the same name

When trying to import customizations for a specific entity, I get an error saying that I can't reuse system queries for a custom entity.
The error in the title appears in the detailed CRM trace:
Could not import a Saved Query {C9771189-0CB3-E111-A93D-00505699001D} for ObjectTypeCode 10010 because this is a system Saved Query. The Entity you are trying to import is not the same as the one existing in the database even though it has the same name.
The ID of the query is the one in the source customizations.
Both the source and target entities seem to have the exact same queries, with the same names.
Google hasn't been able to offer much on this.
Can anyone shed any light on the subject?
Is your source environment upgraded from v3.0? If so, have you applied the hotfix (or latest rollup) mentioned in this KB article to both servers?
