Purpose of "memory path" variable in ATestnetConsumer.sol example - chainlink

Hello, I'm working on creating my own custom Chainlink job and an associated consumer contract. I am using the example "ATestnetConsumer.sol" contract as a guide (https://github.com/smartcontractkit/documentation/blob/main/_includes/samples/APIRequests/ATestnetConsumer.sol). I am confused about the purpose of the req variable and the path string array. What role do these play when the Chainlink node executes the job?
function requestEthereumLastMarket(address _oracle, string memory _jobId)
    public
    onlyOwner
{
    Chainlink.Request memory req = buildChainlinkRequest(stringToBytes32(_jobId), address(this), this.fulfillEthereumLastMarket.selector);
    req.add("get", "https://min-api.cryptocompare.com/data/pricemultifull?fsyms=ETH&tsyms=USD");
    string[] memory path = new string[](4);
    path[0] = "RAW";
    path[1] = "ETH";
    path[2] = "USD";
    path[3] = "LASTMARKET";
    req.addStringArray("path", path);
    sendChainlinkRequestTo(_oracle, req, ORACLE_PAYMENT);
}
Edit: I did figure out what the path variable is for: it controls how the jsonparse task walks the JSON response. For example, the path listed above drills down like this:
{"RAW": {"ETH": {"USD": {"LASTMARKET": value}}}}

The string[] memory path is the path the Chainlink node walks to extract data from the JSON response.
For example, if your JSON looks like this:
{
    "cat": {
        "tabby": 7,
        "cool": 2
    }
}
And you wanted the cool value of 2, your path would be:
string[] memory path = new string[](2);
path[0] = "cat";
path[1] = "cool";
req.addStringArray("path", path);
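Conceptually, the node's jsonparse task just walks the nested response one key at a time until it reaches the value. Here is a minimal Java sketch of that idea (illustrative only, not the node's actual implementation; plain Maps stand in for the parsed JSON):

import java.util.List;
import java.util.Map;

public class JsonPathWalk {
    // Follows a path of keys through nested maps, the way the
    // path ["cat", "cool"] selects the value 2 in the example above.
    static Object walk(Object json, List<String> path) {
        Object current = json;
        for (String key : path) {
            current = ((Map<?, ?>) current).get(key);
        }
        return current;
    }

    public static void main(String[] args) {
        Map<String, Object> json = Map.of("cat", Map.of("tabby", 7, "cool", 2));
        System.out.println(walk(json, List.of("cat", "cool"))); // prints 2
    }
}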

Related

Getting the path of an ArrayList<Image> that is already set and converting it to a string

My program creates a "Deck" of cards in a List. I want to be able to take each image, get its path, and parse that path to recover the information encoded in the image's file name.
For this, I am using an ArrayList to store the cards.
List<Image> mainDeck = new ArrayList<Image>();
To load the images, I am using this code:
public List load(List<Image> newDeck) {
    count = 0;
    for (int i = 0; i < 4; i++) {
        for (int k = 0; k < 10; k++) {
            newDeck.add(new Image("images/" + prefix.get(i) + "" + (k + 1) + ".png"));
            count++;
        }// end number card for loop
        for (int l = 0; l < 3; l++) {
            newDeck.add(new Image("images/" + prefix.get(l) + "" + prefixFace.get(l) + ".png"));
            count++;
        }// end face card for loop
    }// end deck for loop
    return newDeck;
}
It then gets called and populates the list with images, which works perfectly. I would like to create a matching array of Strings that holds the path for each entry in the Image list.
The names of the images are "c1.png", "c2.png", etc., and I just need the number in the path name.
Once I have that array I should be able to parse it to get the numbers.
Any help would be much appreciated.
When using getUrl(), I am getting an error. Here is that code:
for (Image card : mainDeck) {
    String path = card.getUrl();
    String name = path.substring(path.lastIndexOf("/") + 1, path.lastIndexOf("."));
    nameData.put(card, name);
}
It is not recognizing card.getUrl();
You don't have to create another ArrayList for the paths, because the path data is already stored with the images.
If you want to retrieve the path of an image after creating it, you can use the getUrl() method of the Image class. Calling getUrl() on an Image object returns the path you passed to the constructor when you created the image. Note that this only works if you constructed the Image from a String path; it will not work if you initialized the image from an InputStream. Also note that getUrl() was only added in Java 9. After getting the path, you can split it to extract the data you want, something like:
Image image = new Image("file:/C:/Users/zinou/Desktop/edit.png");
String path = image.getUrl();
String name = path.substring(path.lastIndexOf("/") + 1, path.lastIndexOf("."));
System.out.println(name);
And if I were to associate every card with its path, I would use a HashMap, as in:
List<Image> mainDeck = new ArrayList<Image>();
//population of the list
HashMap<Image, String> nameData = new HashMap<Image, String>();
for (Image card : mainDeck) {
    String path = card.getUrl();
    String name = path.substring(path.lastIndexOf("/") + 1, path.lastIndexOf("."));
    nameData.put(card, name);
}
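To get from a name like "c1" to the number itself, a small helper along these lines could work (the regex approach here is an assumption, not from the original answer):

// Hypothetical helper: extracts the numeric part of a name like "c1" -> 1.
// Assumes the name is a letter prefix followed by digits.
static int cardNumber(String name) {
    return Integer.parseInt(name.replaceAll("\\D+", ""));
}
// e.g. cardNumber("c12") returns 12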
If you have Java 8:
You can create a class that extends Image and give it an additional attribute (the URL), plus a getter so you can access it. Your cards will then be instances of this new class, so you can get the URL from them. The new class may look like this:
public class MyImage extends javafx.scene.image.Image {
    String url;

    public MyImage(String arg0) {
        super(arg0);
        url = arg0;
    }

    public String geturl() {
        return url;
    }
}
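A quick usage sketch of that subclass (the deck population here is illustrative). Because the list is typed List<Image>, you need a cast to reach the stored URL:

List<Image> mainDeck = new ArrayList<>();
mainDeck.add(new MyImage("images/c1.png"));
// Cast back to MyImage to reach the stored path.
String url = ((MyImage) mainDeck.get(0)).geturl(); // "images/c1.png"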

Get workflow malfunction exception with Java API

Does anyone know how to get a workflow malfunction error message using the Java PE API? I am running the QueueSample Java code provided by IBM and it is not clear to me how to do this. Any help would be appreciated!
I found the malfunction error message for my workflow in the VWParticipantHistory.getLogFields() array. I modified the example code from the Developing Applications with IBM FileNet P8 APIs Redbook:
// Create session object and log onto Process Engine
...
// Get the specific work item
...
// Get VWProcess object from work object
VWProcess process = stepElement.fetchProcess();

// Get workflow definitions from the VWProcess
VWWorkflowDefinition workflowDefinition =
    process.fetchWorkflowDefinition(false);

// Get maps for each workflow definition
VWMapDefinition[] workflowMaps = workflowDefinition.getMaps();

// Iterate through each map in the workflow Definition
for (int i = 0; i < workflowMaps.length; i++) {
    // Get map ID and map name for each map definition
    int mapID = workflowMaps[i].getMapId();
    String mapName = workflowMaps[i].getName();

    // Get workflow history information for each map
    VWWorkflowHistory workflowHistory = process.fetchWorkflowHistory(mapID);
    String workflowOriginator = workflowHistory.getOriginator();

    // Iterate through each item in the Workflow History
    while (workflowHistory.hasNext()) {
        // Get step history objects for each workflow history
        VWStepHistory stepHistory = workflowHistory.next();
        String stepName = stepHistory.getStepName();
        System.out.println("step history name = " + stepName);

        // Iterate through each item in the Step History
        while (stepHistory.hasNext()) {
            // Get step occurrence history
            // objects for each step history object
            VWStepOccurrenceHistory stepOccurenceHistory = stepHistory.next();
            Date stepOcurrenceDateReceived = stepOccurenceHistory.getDateReceived();
            Date stepOcurrenceDateCompleted = stepOccurenceHistory.getCompletionDate();

            while (stepOccurenceHistory.hasNext()) {
                // Get step work object information
                // for each step occurrence
                VWStepWorkObjectHistory stepWorkObjectHistory = stepOccurenceHistory.next();
                stepWorkObjectHistory.resetFetch();

                // Get participant information for each work object
                while (stepWorkObjectHistory.hasNext()) {
                    VWParticipantHistory participantHistory = stepWorkObjectHistory.next();
                    String opName = participantHistory.getOperationName();
                    System.out.println("operation name = " + opName);
                    Date participantDateReceived = participantHistory.getDateReceived();
                    String participantComments = participantHistory.getComments();
                    String participantUser = participantHistory.getUserName();
                    String participantName = participantHistory.getParticipantName();
                    VWDataField[] logFields = participantHistory.getLogFields();
                    System.out.println("** start get log fields **");
                    for (int index = 0; index < logFields.length; index++) {
                        VWDataField dataField = logFields[index];
                        String name = dataField.getName();
                        String val = dataField.getStringValue();
                        System.out.println("name = " + name + " , value = " + val);
                    }
                    System.out.println("** end get log fields **");
                } // while stepWorkObjectHistory
            } // while stepOccurenceHistory
        } // while stepHistory
    } // while workflowHistory
} // for workflow maps
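Since the malfunction message turned out to live in the log fields, the essential extraction can be isolated into a small helper (a sketch based only on the calls used above; the PE API's checked exceptions are declared generically here):

// Minimal sketch: print the log fields of one participant history entry,
// which is where the workflow malfunction error message showed up.
static void printLogFields(VWParticipantHistory participantHistory) throws Exception {
    for (VWDataField dataField : participantHistory.getLogFields()) {
        System.out.println(dataField.getName() + " = " + dataField.getStringValue());
    }
}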

TensorFlow serving function using tf.Estimators causes error when calling from Java

I have successfully created the model and wanted to export it for prediction from a Java client, but when invoking prediction through the prediction stub from Java, it errors out because I need to place the serialized Example into a placeholder object when calling predict:
You must feed a value for placeholder tensor 'input_example_tensor' with dtype string and shape [?]
Can anyone help me create a tensor placeholder using protobuf in Java?
The full error is below:
io.grpc.StatusRuntimeException: INVALID_ARGUMENT: You must feed a value for placeholder tensor 'input_example_tensor' with dtype string and shape [?]
     [[Node: input_example_tensor = Placeholder[dtype=DT_STRING, shape=[?], _device="/job:localhost/replica:0/task:0/device:CPU:0"]()]]
    at io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:221)
    at io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:202)
    at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:131)
    at tensorflow.serving.PredictionServiceGrpc$PredictionServiceBlockingStub.predict(PredictionServiceGrpc.java:332)
My signature definition, as shown by saved_model_cli, is below:
The given SavedModel SignatureDef contains the following input(s):
  inputs['inputs'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: Placeholder:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['classes'] tensor_info:
      dtype: DT_STRING
      shape: (-1, 2)
      name: dnn/head/Tile:0
  outputs['scores'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 2)
      name: dnn/head/predictions/probabilities:0
Method name is: tensorflow/serving/classify
Below is the Java code used to create the request object:
long start1 = System.currentTimeMillis();
HashMap<String, Feature> inputFeatureMap = new HashMap();
ByteString inputStr = null;
List<ByteString> inputList = new ArrayList<ByteString>();
HashMap<String, Object> inputData = new HashMap<String, Object>();
inputData.put("bid", Float.parseFloat("-1.628"));
inputData.put("day_of_week", "6");
inputData.put("hour_of_day", "5");
inputData.put("connType", "wifi");
inputData.put("geo", "0");
inputData.put("size", "Phone");
inputData.put("cat", "arcadegame");
inputData.put("os", "7");
inputData.put("conv", Float.parseFloat("4"));
inputData.put("time", Float.parseFloat("650907"));
inputData.put("conn", Float.parseFloat("5"));
for (Map.Entry<String, Object> entry : inputData.entrySet()) {
    Feature feature = null;
    String featureName = entry.getKey();
    Object featureValue = entry.getValue();
    if (featureValue instanceof Float) {
        feature = Feature.newBuilder()
                .setFloatList(FloatList.newBuilder().addValue(Float.parseFloat(featureValue.toString())))
                .build();
    } else if (featureValue instanceof String) {
        feature = Feature.newBuilder()
                .setBytesList(
                        BytesList.newBuilder().addValue(ByteString.copyFromUtf8(featureValue.toString())))
                .build();
    } else if (featureValue instanceof Integer) {
        feature = Feature.newBuilder()
                .setInt64List(Int64List.newBuilder().addValue(Integer.parseInt(featureValue.toString())))
                .build();
    }
    if (feature != null) {
        inputFeatureMap.put(featureName, feature);
    }
    Features features = Features.newBuilder().putAllFeature(inputFeatureMap).build();
    inputStr = Example.newBuilder().setFeatures(features).build().toByteString();
}
TensorProto.Builder asyncReBuilder = TensorProto.newBuilder();
asyncReBuilder.addStringVal(inputStr);
TensorShapeProto.Dim idsDim2 = TensorShapeProto.Dim.newBuilder().setSize(inputList.size()).build();
TensorShapeProto idsShape2 = TensorShapeProto.newBuilder().addDim(idsDim2).build();
asyncReBuilder.setDtype(DataType.DT_STRING).setTensorShape(idsShape2);
TensorProto allReqAsyncProto = asyncReBuilder.build();
TensorProto proto = allReqAsyncProto;

// Generate gRPC request
com.google.protobuf.Int64Value version = com.google.protobuf.Int64Value.newBuilder().setValue(modelVersion)
        .build();
Model.ModelSpec modelSpec = Model.ModelSpec.newBuilder().setName(modelName).setVersion(version).build();
Predict.PredictRequest request = Predict.PredictRequest.newBuilder().setModelSpec(modelSpec)
        .putAllInputs(ImmutableMap.of("inputs", proto)).build();

// Request gRPC server
PredictResponse response;
try {
    response = blockingStub.predict(request);
    long end = System.currentTimeMillis();
    long diff = end - start1;
    System.out.println("diff:" + diff);
    System.out.println("Response output count is - " + response.getOutputsCount());
    System.out.println("outputs are: - " + response.getOutputs());
    System.out.println("*********************************************");
    // response = asyncStub.predict(request);
    System.out.println("PREDICTION COMPLETE>>>>>>");
} catch (StatusRuntimeException e) {
    e.printStackTrace();
    return;
}
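Incidentally, one thing worth flagging in the request-building code above (an observation, not part of the original post): inputList is declared but never filled, yet its size() is used for the tensor shape, so the DT_STRING tensor is declared with a 0-length dimension even though one string value is added. A sketch of what was likely intended (not the fix the author ultimately used):

// Collect each serialized Example, then size the shape from the list.
inputList.add(inputStr);
TensorProto proto = TensorProto.newBuilder()
        .setDtype(DataType.DT_STRING)
        .setTensorShape(TensorShapeProto.newBuilder()
                .addDim(TensorShapeProto.Dim.newBuilder().setSize(inputList.size())))
        .addAllStringVal(inputList)
        .build();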
NOTE: I exported the model successfully using the following export function:
def _make_serving_input_fn(working_dir):
  """Creates an input function reading from raw data.

  Args:
    working_dir: Directory to read transformed metadata from.

  Returns:
    The serving input function.
  """
  raw_feature_spec = RAW_DATA_METADATA.schema.as_feature_spec()
  # Remove label since it is not available during serving.
  raw_feature_spec.pop(LABEL_KEY)

  def serving_input_fn():
    raw_input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(
        raw_feature_spec)
    raw_features, _, default_inputs = raw_input_fn()
    # Apply the transform function that was used to generate the materialized
    # data.
    _, transformed_features = (
        saved_transform_io.partially_apply_saved_transform(
            os.path.join(working_dir, transform_fn_io.TRANSFORM_FN_DIR),
            raw_features))
    serialized_tf_example = tf.placeholder(dtype=tf.string,
                                           shape=[None])
    receiver_tensors = {'examples': serialized_tf_example}
    return tf.estimator.export.ServingInputReceiver(transformed_features, receiver_tensors)

  return serving_input_fn
Anyway, I resolved it using a different serving export function, as given below:
def _make_serving_input_fn(working_dir):
  """Creates an input function reading from raw data.

  Args:
    working_dir: Directory to read transformed metadata from.

  Returns:
    The serving input function.
  """
  raw_feature_spec = RAW_DATA_METADATA.schema.as_feature_spec()
  # Remove label since it is not available during serving.
  raw_feature_spec.pop(LABEL_KEY)

  def serving_input_fn():
    raw_input_fn = input_fn_utils.build_parsing_serving_input_fn(
        raw_feature_spec, default_batch_size=None)
    raw_features, _, inputs = raw_input_fn()
    # Apply the transform function that was used to generate the materialized
    # data.
    _, transformed_features = (
        saved_transform_io.partially_apply_saved_transform(
            os.path.join(working_dir, transform_fn_io.TRANSFORM_FN_DIR),
            raw_features))
    return tf.estimator.export.ServingInputReceiver(
        transformed_features, inputs)

  return serving_input_fn
The change was to get the inputs from the deprecated contrib function input_fn_utils.build_parsing_serving_input_fn(), apply the transformation, and then create and return a ServingInputReceiver() from those inputs.

How can I select 100 random nodes from the whole graph with the Gephi Toolkit?

I'm working on a project using the Gephi Toolkit. I need to select 100 random nodes from the whole graph.
public void script() {
    // Init a project - and therefore a workspace
    ProjectController pc = Lookup.getDefault().lookup(ProjectController.class);
    pc.newProject();
    Workspace workspace = pc.getCurrentWorkspace();
    GraphModel graphModel = Lookup.getDefault().lookup(GraphController.class).getGraphModel();
    PreviewModel model = Lookup.getDefault().lookup(PreviewController.class).getModel();
    ImportController importController = Lookup.getDefault().lookup(ImportController.class);
    FilterController filterController = Lookup.getDefault().lookup(FilterController.class);
    AppearanceController appearanceController = Lookup.getDefault().lookup(AppearanceController.class);
    AppearanceModel appearanceModel = appearanceController.getModel();

    // Import file
    Container container;
    try {
        // File file = new File(getClass().getResource("/org/gephi/toolkit/demos/polblogs.gml").toURI());
        File file = new File(getClass().getResource("/org/gephi/toolkit/demos/Book3.csv").toURI());
        container = importController.importFile(file);
        container.getLoader().setEdgeDefault(EdgeDirectionDefault.DIRECTED); // Force DIRECTED
    } catch (Exception ex) {
        ex.printStackTrace();
        return;
    }

    // Append imported data to GraphAPI
    importController.process(container, new DefaultProcessor(), workspace);

    // See if graph is well imported
    DirectedGraph graph = graphModel.getDirectedGraph();
    System.out.println("Nodes: " + graph.getNodeCount());
    System.out.println("Edges: " + graph.getEdgeCount());
This code returns the number of nodes and edges, but I can't find a function to extract a random subset of nodes. I need a fixed number of nodes, not all of them, because I'm working on a genetic algorithm and need to generate an initial population. Any ideas?
There is probably a nicer way, but you could use the node iterator returned from graph.getNodes(), or use graph.getNodes().toArray(), which returns an array of nodes.
You could then extract 100 random nodes from that array using Math.random(). If you add your results to a Set, you can be sure you aren't getting the same node more than once.
Not tested, but something like this...
static final int POPULATION_SIZE = 100;

public Set<Node> getInitialPopulation(Graph graph) {
    Node[] nodes = graph.getNodes().toArray();
    Set<Node> initialPopulation = new HashSet<>();

    if (nodes.length < POPULATION_SIZE) {
        for (Node node : nodes) {
            initialPopulation.add(node);
        }
        return initialPopulation;
    }

    while (initialPopulation.size() < POPULATION_SIZE) {
        initialPopulation.add(nodes[(int) (Math.random() * nodes.length)]);
    }
    return initialPopulation;
}
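An alternative sketch (not from the original answer): shuffle a copy of the node array and take the first POPULATION_SIZE entries, which avoids repeated collisions as the set fills up:

import java.util.*;

public Set<Node> getInitialPopulation(Graph graph) {
    // Copy into a modifiable list, permute it uniformly, take the head.
    List<Node> all = new ArrayList<>(Arrays.asList(graph.getNodes().toArray()));
    Collections.shuffle(all);
    return new HashSet<>(all.subList(0, Math.min(POPULATION_SIZE, all.size())));
}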

Compiler error when combining Linq + "RangeVariables" + TPL + DynamicTableEntity

I'm looking at the Microsoft-provided sample "Process Tasks as they Finish" and adapting that TPL sample for Azure Storage.
The problem is marked below, where the variable domainData produces this compiler error: Unknown method Select(?) of TableQuerySegment<DynamicTableEntity> (fully qualified namespace removed).
I also get the following error on DynamicTableEntity domainData: Unknown type of variable domainData.
// If you have the necessary references, the following should compile and give you the same error:
CloudStorageAccount acct = CloudStorageAccount.DevelopmentStorageAccount;
CloudTableClient client = acct.CreateCloudTableClient();
CloudTable tableSymmetricKeys = client.GetTableReference("SymmetricKeys5");
TableContinuationToken token = new TableContinuationToken() { };
TableRequestOptions opt = new TableRequestOptions() { };
OperationContext ctx = new OperationContext() { ClientRequestID = "ID" };
CancellationToken cancelToken = new CancellationToken();
List<Task> taskList = new List<Task>();
var task2 = tableSymmetricKeys.CreateIfNotExistsAsync(cancelToken);
task2.Wait(cancelToken);
int depth = 3;
while (true)
{
    Task<TableQuerySegment<DynamicTableEntity>> task3 = tableSymmetricKeys.ExecuteQuerySegmentedAsync(query, token, opt, ctx, cancelToken);
    // Run the method
    task3.Wait();
    Console.WriteLine("Records retrieved in this attempt = " + task3.Result.Count());// + " | Total records retrieved = " + state.TotalEntitiesRetrieved);

    // HELP! This is where I'm doing something the compiler doesn't like
    IEnumerable<Task<int>> getTrustDataQuery =
        from domainData in task3.Result select QueryPartnerForData(domainData, "yea, search for this.", client, cancelToken);

    // Prepare for next iteration or quit
    if (token == null)
    {
        break;
    }
    else
    {
        token = task3.Result.ContinuationToken;
        // todo: persist token token.WriteXml()
    }
}
//....
private static object QueryPartnerForData(DynamicTableEntity domainData, string p, CloudTableClient client, CancellationToken cancelToken)
{
    throw new NotImplementedException();
}
Your code is missing a query. In order to test the code I created the following query:
TableQuery<DynamicTableEntity> query = new TableQuery<DynamicTableEntity>()
    .Where(TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, "temp"));
I also added the method QueryPartnerForData, which doesn't do anything (it simply returns null), and everything works fine. So maybe it's an issue with the QueryPartnerForData method? The best way to find the actual error is by setting breakpoints here and there.
A StackOverflowException usually means you are stuck in unbounded recursion. Run through the breakpoints a few times and see where your code gets stuck. Could it be that QueryPartnerForData calls another method, and that method calls QueryPartnerForData again?
