I have a FreeMarker template to generate XML from a POJO. During this conversion I also need to populate some extra data into the template. If template processing is sequential, the execution time will be higher. So is it possible to execute some parts of the template generation asynchronously, or to invoke some APIs asynchronously?
There's no feature for parallelizing the execution of a single template.
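What you can do, though, is split the output into independent templates, render them on separate threads, and concatenate the results yourself; a shared Configuration is safe to use concurrently once it has been set up. Below is a minimal sketch of that idea; the template names, data models, and thread count are made up for illustration:

```java
import freemarker.template.Configuration;
import freemarker.template.Template;
import java.io.StringWriter;
import java.util.List;
import java.util.Map;
import java.util.concurrent.*;

public class ParallelRendering {
    public static void main(String[] args) throws Exception {
        // A shared, already-configured Configuration is safe to use from multiple threads.
        Configuration cfg = new Configuration(Configuration.VERSION_2_3_31);
        cfg.setClassForTemplateLoading(ParallelRendering.class, "/templates");

        ExecutorService pool = Executors.newFixedThreadPool(2);
        // Hypothetical template names and data models for two independent XML fragments.
        List<Future<String>> parts = List.of(
                pool.submit(() -> render(cfg, "header.ftl", Map.of("title", "Report"))),
                pool.submit(() -> render(cfg, "body.ftl", Map.of("items", List.of(1, 2, 3)))));

        StringBuilder xml = new StringBuilder();
        for (Future<String> part : parts) {
            xml.append(part.get()); // blocks until that fragment is rendered
        }
        pool.shutdown();
        System.out.println(xml);
    }

    private static String render(Configuration cfg, String name, Object model) throws Exception {
        Template template = cfg.getTemplate(name);
        StringWriter out = new StringWriter();
        template.process(model, out);
        return out.toString();
    }
}
```

Whether this helps depends on where the time actually goes: if it is spent fetching the extra data rather than in template processing itself, you can also simply fetch that data concurrently before a single process() call.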
Using RTK Query code generation, I have generated a slice of my API from an OpenAPI spec. Following on from that example, I have extended the generated slice as described, using generatedApi.enhanceEndpoints({/**/}).
Now I want to add prepareHeaders to the slice, which is typically set via fetchBaseQuery; per the docs, my use case is adding an auth token to each request. As the createApi function is called within the generated file, I'd like to avoid touching that file to include custom logic.
I think I'm looking for something like generatedApi.enhancePrepareHeaders({/**/}) which does not seem to exist yet.
How do I set headers for all requests when following the code splitting approach and without touching the generated file?
At the moment, that is only possible by writing a custom baseQuery function wrapping the original fetchBaseQuery.
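A rough sketch of such a wrapper is shown below; the baseUrl, the shape of the auth state, and the header name are assumptions, and how the generated slice ends up using this baseQuery depends on how your codegen setup references it:

```typescript
import { fetchBaseQuery } from '@reduxjs/toolkit/query/react';
import type { BaseQueryFn, FetchArgs, FetchBaseQueryError } from '@reduxjs/toolkit/query';

// The wrapped fetchBaseQuery; baseUrl and the token lookup are placeholders.
const rawBaseQuery = fetchBaseQuery({
  baseUrl: '/api',
  prepareHeaders: (headers, { getState }) => {
    // Assumed state shape: replace with however your store exposes the auth token.
    const token = (getState() as { auth?: { token?: string } }).auth?.token;
    if (token) {
      headers.set('Authorization', `Bearer ${token}`);
    }
    return headers;
  },
});

// A custom baseQuery that simply delegates to the wrapped fetchBaseQuery.
// Extra logic (re-auth, logging, ...) could be added around this call.
export const baseQueryWithAuth: BaseQueryFn<
  string | FetchArgs,
  unknown,
  FetchBaseQueryError
> = async (args, api, extraOptions) => rawBaseQuery(args, api, extraOptions);
```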
From the next version of the code generator on, it will only create injectEndpoints calls and leave all the baseQuery configuration to a non-generated file.
I am getting some numerical data from a URL via an API, and I am looking for a way to perform some mathematical operations on it in Apache NiFi before writing the data to a file directory. Thanks in advance.
By the way, I am using the InvokeHTTP processor to get the data and the PutFile processor to write the file somewhere. I searched some related websites but could not find a working approach.
Try using the QueryRecord processor and define Record Reader/Writer controller services to read/write the flowfile.
Add a new property to the QueryRecord processor containing an Apache Calcite SQL query that applies your mathematical operations to the flowfile.
The results of the SQL query will be written to the outgoing flowfile in your desired format.
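For example, a dynamic property on QueryRecord (whose name becomes an outgoing relationship) could hold a Calcite SQL statement along these lines; the field names here are hypothetical:

```sql
-- FLOWFILE is the table name QueryRecord exposes for the incoming records.
-- "price" and "quantity" are made-up fields from the fetched data.
SELECT
  id,
  price * quantity      AS total,
  ROUND(price * 1.2, 2) AS price_with_tax
FROM FLOWFILE
```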
Ultimately the answer depends on whether the data you're working with is in the content of the FlowFile or in its attributes. If the data is small enough and it's only a couple of operations, the suggested approach would be to work with the data as attributes and use NiFi's expression language to do the transformations.
There is a section on mathematical operations [1] in the Apache NiFi documentation [2]. The operations range from simple operators like plus/minus to exposing the java.lang.Math static methods; a small example follows the links below.
[1] https://nifi.apache.org/docs/nifi-docs/html/expression-language-guide.html#numbers
[2] https://nifi.apache.org/docs.html
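For instance, an UpdateAttribute processor could derive new attributes from an existing numeric attribute; `amount` here is a hypothetical attribute name:

```
${amount:toNumber():multiply(100):plus(5)}
${amount:toDecimal():math("sqrt")}
```

The first expression uses the basic arithmetic functions; the second calls into java.lang.Math via the math() function described in the guide linked above.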
You can try ExecuteStreamCommand if you want to take in the whole file and then run operations on it. Alternatively, you can work with the attributes of the flowfile, depending on how large your operation is.
For example, if you have some initial values you can include them in the name of the file, extract them into flowfile attributes, run the operations on those attributes, and then append the results to the bottom of the original file.
Sometimes I'm facing very large Elasticsearch queries with duplicated parts, where the same filtering structure is applied inside the aggregations (for every aggregation field). Such queries are too massive to inspect. Is there any way to decrease the request body size? Some kind of aliases maybe; I need something like variables in YAML. Or maybe you could suggest something else. Thanks!
Please have a look at search templates. You'll be able to store query templates in the cluster, use variables, and even build dynamic queries:
https://www.elastic.co/guide/en/elasticsearch/reference/current/search-template.html
Using this feature will reduce your request body dramatically, as you'll just refer to a pre-registered template, providing some parameters if needed.
Repeating blocks and conditional sections are possible using the Mustache templating language: http://mustache.github.io/mustache.5.html
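As a rough sketch (the index, field, and template names are made up), you register the template once and then search by referring to its id with parameters:

```
PUT _scripts/filtered-aggs-template
{
  "script": {
    "lang": "mustache",
    "source": {
      "query": { "term": { "status": "{{status}}" } },
      "aggs": {
        "by_field": { "terms": { "field": "{{agg_field}}" } }
      }
    }
  }
}

GET my-index/_search/template
{
  "id": "filtered-aggs-template",
  "params": {
    "status": "active",
    "agg_field": "category"
  }
}
```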
Have fun!
I'm using an application that gets me some data and then renders a config file based on a given Go template. You basically pass a template you've made as a parameter, and the app does its job with it. The template is getting bigger and bigger, so I wanted to wrap some common stuff into sub-templates (I mean, {{ define x }}). The problem I'm encountering is that the sub-template needs to be passed several parameters which are not part of my 'dot', and I can't really find a way to do this in Go.
The best answer I've found is to write some 'dict' function myself and then use it inside the template, but that would mean I basically need to fork the whole application I'm using to render the template, make 10-15 lines of changes, and then use this modified version, which is nonsense.
I'm wondering if there's any real solution to my problem without having to do some crazy forking and writing custom methods on the application side?
Edit:
I've already checked Calling a template with several pipeline parameters, although it doesn't answer my question, since I need a way to do this using only the template file.
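For reference, the 'dict' helper mentioned above usually looks something like the sketch below. It has to be registered via Funcs on the application side, which is exactly the change this question is trying to avoid; the standard text/template package has no built-in equivalent. The template text and names are made up:

```go
package main

import (
	"errors"
	"os"
	"text/template"
)

// dict packs alternating key/value arguments into a map so that a
// sub-template can be called with several named parameters.
func dict(values ...interface{}) (map[string]interface{}, error) {
	if len(values)%2 != 0 {
		return nil, errors.New("dict: odd number of arguments")
	}
	m := make(map[string]interface{}, len(values)/2)
	for i := 0; i < len(values); i += 2 {
		key, ok := values[i].(string)
		if !ok {
			return nil, errors.New("dict: keys must be strings")
		}
		m[key] = values[i+1]
	}
	return m, nil
}

const page = `
{{ define "greeting" }}Hello {{ .Name }}, you have {{ .Count }} messages.
{{ end }}
{{ template "greeting" dict "Name" .User "Count" 3 }}`

func main() {
	t := template.Must(template.New("page").Funcs(template.FuncMap{"dict": dict}).Parse(page))
	// "User" comes from the dot; "Count" is an extra parameter not present in the dot.
	if err := t.Execute(os.Stdout, map[string]string{"User": "Alice"}); err != nil {
		panic(err)
	}
}
```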
I am developing an auto-completion (suggestion) box using AJAX and servlets. My problem is how to parse the XML response in JavaScript so I can show it in a div tag. My XML response contains one parent tag, RESULTS, which contains a number of child tags called RESULT.
How do I get the result values into JavaScript variables?
You could do this using jQuery.parseXML
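A minimal sketch, assuming jQuery is loaded, the raw XML text from the AJAX response is in a variable called responseText, and the suggestions go into a div with a made-up id of "suggestions":

```javascript
// responseText is the XML string from the servlet, e.g.
// "<RESULTS><RESULT>apple</RESULT><RESULT>apricot</RESULT></RESULTS>"
var xmlDoc = $.parseXML(responseText);
var values = [];

// Collect the text of every RESULT element into a plain JavaScript array.
$(xmlDoc).find("RESULT").each(function () {
  values.push($(this).text());
});

// Render each value as its own line inside the suggestion div.
$("#suggestions").html(
  values.map(function (v) { return "<div>" + v + "</div>"; }).join("")
);
```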
Also take a look at this thread: Parse XML from XMLHttpRequest
I think it contains the information you need.