fhir-net-api (STU3) - Hl7.Fhir.Model.PlanDefinition parsing error - hl7-fhir

Using HL7.FHIR.STU3.Core, I am getting an invalid cast exception when I try to parse a PlanDefinition FHIR file.
Do I need to set the schema for the PlanDefinition file?
string HL7FilePath = string.Format("{0}\\{1}", System.IO.Directory.GetCurrentDirectory(), "ANA3.xml");
string HL7FileData = File.ReadAllText(HL7FilePath);
var b = new FhirXmlParser().Parse<Bundle>(HL7FileData);
Error
InvalidCastException {"Unable to cast object of type 'Hl7.Fhir.Model.PlanDefinition' to type 'Hl7.Fhir.Model.Bundle'."}

You are trying to parse a PlanDefinition resource into a Bundle object, as the InvalidCastException tells you. If you change Parse<Bundle> to Parse<PlanDefinition>, your code should work fine.
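A minimal sketch of the corrected call, reusing the file-loading code above (the Parse<Resource> variant is an assumption for the case where you don't know the resource type up front; the parser picks the concrete type from the XML root):
var parser = new FhirXmlParser();
// The file contains a single PlanDefinition, so parse it as one.
PlanDefinition planDefinition = parser.Parse<PlanDefinition>(HL7FileData);
// If the resource type isn't known in advance, parse to the Resource base class instead.
Resource resource = parser.Parse<Resource>(HL7FileData);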

Related

Elasticsearch - Convert JSON String to XContentBuilder

I was using Elasticsearch 6.2.2, and this is how I converted a JSON string to an XContentBuilder.
XContentBuilder builder = JsonXContent.contentBuilder().prettyPrint();
XContentParser parser = JsonXContent.jsonXContent.createParser(NamedXContentRegistry.EMPTY, jsonObj.toString());
builder.copyCurrentStructure(parser);
It worked well until I updated to Elasticsearch 6.3+.
The same code produces an error on ES 6.3+:
Description Resource Path Location Type The method
createParser(NamedXContentRegistry, DeprecationHandler, String) in the
type JsonXContent is not applicable for the arguments
(NamedXContentRegistry, String) test.java
The compile error spells it out: your createParser call is missing a DeprecationHandler parameter.
So you should pass a DeprecationHandler, for example:
JsonXContent.jsonXContent.createParser(NamedXContentRegistry.EMPTY,
LoggingDeprecationHandler.INSTANCE,
jsonObj.toString());
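For context, a fuller sketch of the 6.3+ version of the original snippet (assuming the usual org.elasticsearch.common.xcontent imports; jsonObj is the JSON object from the question):
import org.elasticsearch.common.xcontent.LoggingDeprecationHandler;
import org.elasticsearch.common.xcontent.NamedXContentRegistry;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentParser;
import org.elasticsearch.common.xcontent.json.JsonXContent;

// Build an XContentBuilder from a JSON string on ES 6.3+.
// LoggingDeprecationHandler logs deprecated fields instead of throwing.
XContentBuilder builder = JsonXContent.contentBuilder().prettyPrint();
XContentParser parser = JsonXContent.jsonXContent.createParser(
        NamedXContentRegistry.EMPTY,
        LoggingDeprecationHandler.INSTANCE,
        jsonObj.toString());
builder.copyCurrentStructure(parser);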

ServiceNow: transform map stopped due to error: java.lang.NumberFormatException

The transform is getting aborted, but only if I check the "copy empty fields" checkbox, and the rest of the entries in the import set get stuck at Pending. I also verified the transform script, but no luck.
Below is the error:
Import set: ISETxxxxxxx transform stopped due to error: java.lang.NumberFormatException
java.lang.NumberFormatException
at java.math.BigDecimal.<init>(BigDecimal.java:596)
at java.math.BigDecimal.<init>(BigDecimal.java:383)
at java.math.BigDecimal.<init>(BigDecimal.java:806)
at com.glide.script.glide_elements.GlideNumber.getSafeBigDecimal(GlideNumber.java:42)
at com.glide.currency.GlideElementCurrency.coerceAmount(GlideElementCurrency.java:406)
at com.glide.currency.GlideElementCurrency.cleanAmount(GlideElementCurrency.java:389)
at com.glide.currency.GlideElementCurrency.setDisplayValue(GlideElementCurrency.java:136)
at com.glide.currency.GlideElementCurrency.setValue(GlideElementCurrency.java:89)
at com.glide.db.impex.transformer.TransformerField.copyEmptyFields(TransformerField.java:202)
at com.glide.db.impex.transformer.TransformerField.setValue(TransformerField.java:130)
at com.glide.db.impex.transformer.TransformerField.transformField(TransformerField.java:84)
at com.glide.db.impex.transformer.TransformRow.transformCurrent(TransformRow.java:100)
at com.glide.db.impex.transformer.TransformRow.transform(TransformRow.java:69)
at com.glide.db.impex.transformer.Transformer.transformBatch(Transformer.java:150)
at com.glide.db.impex.transformer.Transformer.transform(Transformer.java:76)
at com.glide.system_import_set.ImportSetTransformerImpl.transformEach(ImportSetTransformerImpl.java:239)
at com.glide.system_import_set.ImportSetTransformerImpl.transformAllMaps(ImportSetTransformerImpl.java:91)
at com.glide.system_import_set.ImportSetTransformer.transformAllMaps(ImportSetTransformer.java:64)
at com.glide.system_import_set.ImportSetTransformer.transformAllMaps(ImportSetTransformer.java:50)
at com.snc.automation.ScheduledImportSetJob.runImport(ScheduledImportSetJob.java:55)
at com.snc.automation.ScheduledImportJob.execute(ScheduledImportJob.java:45)
at com.glide.schedule.JobExecutor.execute(JobExecutor.java:83)
at com.glide.schedule.GlideScheduleWorker.executeJob(GlideScheduleWorker.java:207)
at com.glide.schedule.GlideScheduleWorker.process(GlideScheduleWorker.java:145)
at com.glide.schedule.GlideScheduleWorker.run(GlideScheduleWorker.java:62)
I'm guessing you have a required field that is a decimal or similar.
The error java.lang.NumberFormatException indicates it's failing to convert an empty string to 0.0.
Use a source script to convert this, something along the lines of:
answer = (function transformEntry(source) {
    // Default empty values to 0.0 so the currency coercion does not choke on ""
    if (source.u_number_field.nil())
        return 0.0;
    // Otherwise pass the source value through unchanged
    return source.u_number_field;
})(source);
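In a transform map this would typically go in the source script for the decimal/currency field mapping, so the empty value is replaced before GlideElementCurrency tries to coerce it (which is where the stack trace above blows up).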

Invalid format: "19690321" is too short

I am trying to convert yyyyMMdd format to yyyy/MM/dd format using Pig. For that I have written the code below.
Code:
STOCK_A = LOAD '/user/root/xxxx/*' USING PigStorage('|');
data = FILTER STOCK_A BY ($1 matches '.*ID.*');
MSH_DATA = FOREACH data GENERATE ToDate($8,'yyyy/MM/dd','UTC') AS dob;
When I try to dump the result I get the error below.
ERROR org.apache.pig.tools.pigstats.SimplePigStats - ERROR 0:
Exception while executing [POUserFunc (Name:
POUserFunc(org.apache.pig.builtin.ToDate3ARGS)[datetime] - scope-209
Operator Key: scope-209) children: null at []]:
java.lang.IllegalArgumentException: Invalid format: "19690321" is too
short
Sample:
EXVORV##PDULD21F|ID|1|483|1020783||EXVORV##PDULD||19690321|F|
$8 seems valid to me; I am not able to locate the reason for the issue. Any help would be really appreciated.
You use:
ToDate($8,'yyyy/MM/dd','UTC')
but the format is
19690321
so you should use
ToDate($8,'yyyyMMdd','UTC')
The issue is most likely because of the load statement. Since you are not specifying the schema, the datatype will be bytearray by default. You will have to cast it to chararray before passing the field to ToDate:
STOCK_A = LOAD '/user/root/xxxx/*' USING PigStorage('|');
data = FILTER STOCK_A BY ($1 matches '.*ID.*');
MSH_DATA = FOREACH data GENERATE ToDate((chararray)$8,'yyyyMMdd','UTC') AS dob;
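Putting both answers together, a sketch of the full conversion: ToDate parses the raw value with the yyyyMMdd pattern it actually has, and ToString (a Pig built-in) renders the resulting datetime in the yyyy/MM/dd format the question asks for. The load and filter lines are taken from the question.
STOCK_A = LOAD '/user/root/xxxx/*' USING PigStorage('|');
data = FILTER STOCK_A BY ($1 matches '.*ID.*');
-- Cast the bytearray field, parse it as yyyyMMdd, then format it as yyyy/MM/dd.
MSH_DATA = FOREACH data GENERATE ToString(ToDate((chararray)$8, 'yyyyMMdd', 'UTC'), 'yyyy/MM/dd') AS dob;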

'SparkContext' object has no attribute 'textfile'

I tried loading a file using the following code:
textdata = sc.textfile('hdfs://localhost:9000/file.txt')
Error message:
AttributeError: 'SparkContext' object has no attribute 'textfile'
It is sc.textFile(...) with a capital F.
You can inspect the API of SparkContext here.
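With that one-character fix the original call becomes:
textdata = sc.textFile('hdfs://localhost:9000/file.txt')  # textFile, with a capital F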

Acceleo Java wrapping service doesn't take complex parameter - Invalid result for expression self.invoke

I can't call a Java wrapping service in Acceleo because it doesn't recognize the parameter type. This is my simple test code: the main template calls a query stored in Services.mtl, which calls the Java service that just returns the name of a "Send" object.
Main.mtl
[file ('system.P', false, 'UTF-8')]
[for (t : Send | aSystemBehavior.transitions)]
[getName(t)/]
[/for]
[/file]
Services.mtl
[query public getName(arg0 : Send) : String
= invoke('myPackage.Services', 'getName(myPackage.Send)', Sequence{arg0})
/]
Services.java
public class Services {
    public String getName(Send t) {
        return t.getName();
    }
}
The Error Log shows:
Invalid result for expression
self.invoke('myPakage.Services',
'getName(myPakage.Send)', Sequence {arg0}) at line 0 in
Module services for query getName(Send). Last recorded value of self
was org.eclipse.emf.ecore.impl.DynamicEObjectImpl#1f00eb36 (eClass:
org.eclipse.emf.ecore.impl.EClassImpl#2c2aade3 (name: Send)
(instanceClassName: null) (abstract: false, interface: false)).
Problem found while generating the file system.P'.
If I use a String as parameter type instead of Send, everything works fine.
Has the package containing the "Services" class been exported? If not, open the MANIFEST.MF file, go to the Runtime tab, and add its package to the list of exported packages. Are you sure that your "Send" object has a name? This message only indicates that null was returned by the getName query.
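For reference, exporting the package ends up as an Export-Package entry in META-INF/MANIFEST.MF; the package name below is taken from the question's code, so adjust it to your plug-in's actual package:
Export-Package: myPackage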
I don't have this problem anymore... I created a new Acceleo project from scratch, and it works. I am not sure what the problem was... maybe it's something about the choice of metamodels to import during the creation of the module (I had to choose between the run-time and develop-time metamodel).
