I have a URL (not REST-based) that, when hit in a browser, results in the automatic download of a .txt file.
Is there a way to use NiFi to invoke this URL so that the download happens on the NiFi server?
Determined that this required the InvokeHttp processor. Works now.
I want to implement the following use case with Apache Camel FTP:
On a remote location I have 0 to n files stored.
When I receive a command, I want to download one file via FTP as a byte array (which one does not matter), if any files are available.
When the file is downloaded, I want to save it in a database as a blob.
Then I want to delete the stored/processed file on the remote location.
Wait for the next download command and, once it is received, go back to step 1.
The files have to be downloaded through the same FTP session.
My problem is that if I use a normal FTP route, it downloads all available files.
When I tell the route to only download one, I have to create a new route for the other files and I cannot reuse the FTP session.
Is there a way to implement this use case with Apache Camel FTP?
Camel-ftp doesn't consume all available files at once; it consumes them individually, one after another, meaning that each file gets processed separately. If you need to process them in some specific order, you can try sorting by file name or modified date with the sortBy option.
If you want to control when a file gets downloaded, i.e. when the command gets called, you can call the FTP consumer endpoint using pollEnrich.
Example:
// 1. Loads one file from the FTP server with a timeout of 3 seconds.
// 2. Logs the body and the headers.
from("direct:example")
    .pollEnrich("ftp://host:port/directoryName", 3000)
    .to("log:loggerName?showBody=true&showHeaders=true");
You can trigger the direct: endpoint with a ProducerTemplate obtained from the CamelContext, or change it to whatever consumer endpoint fits your use case.
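For example, a minimal sketch of firing the command from Java code (how you obtain the CamelContext and what the message body is are assumptions of mine; any body will do, since it only acts as the trigger):

import org.apache.camel.CamelContext;
import org.apache.camel.ProducerTemplate;

// "camelContext" is assumed to be the running CamelContext that hosts the route above.
ProducerTemplate template = camelContext.createProducerTemplate();

// Each message sent to direct:example triggers one pollEnrich call,
// i.e. one file download per received command.
template.sendBody("direct:example", "download-command");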
If you need a dynamic URI, you can use the simple language to provide the URI for pollEnrich and then provide the timeout afterwards.
from("direct:example")
.pollEnrich()
.simple("ftp:host:port/directoryName?fileName=${headers.targetFile}")
.timeout(3000)
.to("log:loggerName?showBody=true&showHeaders=true");
I have a working script that logs in and gets to one website on the web server. What I need is a way to get to the other 10-plus servers with JMeter all at once, to run a good stress test on the websites and their interfaces.
Any help is greatly appreciated
I think that you need to use the DNS Cache Manager, available since JMeter 2.12.
The DNS Cache Manager allows each JMeter thread to resolve the underlying IP address of the request endpoint on its own.
See The DNS Cache Manager: The Right Way To Test Load Balanced Apps guide for a detailed explanation of the background and the configuration details.
This is pretty trivial using the CSV Data Set Config.
Let's assume you are using normal HTTP Request samplers and that these are already set up with a server and path. Let's say it is the server you want to change for each thread. Then you need to:
Create a text file with a different server you want to test on each line (see the sample after this list).
Add a CSV Data Set Config element to the top level.
Configure the CSV Data Set Config to use your text file and set the variable name to server.
In your samplers, change the server name to ${server}.
You can use the same method to change the path and other details.
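For illustration, the text file mentioned above could look like this (the hostnames are placeholders of mine); each thread then picks up the next value as ${server}:

app-server-01.example.com
app-server-02.example.com
app-server-03.example.com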
I'm having a weird problem with JMeter.
Scenario:
Web app running on localhost
Record a simple test in JMeter (login + 1 search)
Execute the test on localhost with JMeter. The test runs OK.
Change the server and port in HTTP Request Defaults to another server's IP and port running the same version of the app.
The test runs but fails at the search with ".FlowExecutionRestorationFailureException: A problem occurred restoring the flow execution with key 'e3s2'"
If I do the same swapping servers (record on the remote server and try to execute locally), the behavior is the same.
Any clues as to what it can be? I don't understand why it manages to do the login and navigate on another server but fails on another action.
In short, if I record a test, it fails at some point if I change the server.
Software:
JMeter 2.12
Primefaces 5.0
Spring Webflow 2.3.1.RELEASE
Apache Tomcat 7.0
My expectation is that there is at least one dynamic parameter which is currently hardcoded into your script. I would suggest doing the following:
Record your login+search flow once again
Inspect the two .jmx scripts to detect any differences (i.e. one or more parameters having different values)
Once you find those problematic parameters, you'll need to look into the server's response body/headers/cookies to see where each one lives.
As soon as you know where the parameter value lives, you can use one of the following PostProcessors (see the sketch after this list):
Regular Expression Extractor
XPath Extractor
CSS/JQuery Extractor
The whole process is called "correlation", so you can use "JMeter correlation" as a search term if the above information is not enough to resolve your problem.
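As an illustration only, and purely an assumption on my part: given the FlowExecutionRestorationFailureException and the 'e3s2' key, the Spring Web Flow execution key is a likely dynamic parameter. If it is passed as the usual execution request parameter, a Regular Expression Extractor could be configured roughly like this, with the extracted value then referenced as ${flowExecutionKey} in the follow-up request:

Apply to            : Main sample only
Field to check      : Body
Reference Name      : flowExecutionKey
Regular Expression  : execution=(e\d+s\d+)
Template            : $1$
Match No.           : 1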
The problem was some XHTML components that didn't have any specified id, so JSF would generate something like id="mainForm:j_idt12". Since my tests don't need to work with dynamically generated HTML (they are simple tests), setting the ids solved the problem.
I am using a Liferay custom portlet, and in it I am using JasperReports. My problem is: how can I download the PDF report directly to the client machine?
Right now I am storing the file on the server first, then providing a URL to the user for downloading the PDF. But how can I send the file directly to the client machine if I have the PDF file's output stream?
Or, if I can somehow know when the user has clicked the download link and the file has finished downloading, how can I then delete the downloaded file from the server? If anyone can guide me...
I'm not sure what you're asking for is possible, though I would be interested in seeing someone correct that statement.
Servers really shouldn't be directly storing files on a client machine, as that violates the intent of the client-server relationship. A client has to make a request for the file, and then the client can save that file (e.g. like an FTP download). Servers just don't manipulate client machines as they see fit.
As far as knowing when a file is downloaded, there isn't anything in a portlet you can do to detect that. You can use a ResourceRequest and the serveResource method to serve a file, but nothing in the portlet API will inform your portlet that the download is complete or that it wasn't interrupted by something.
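Still, a minimal sketch of that serveResource approach, assuming the report has already been written to a temporary file on the server (the file location, the reportId parameter and the delete-after-streaming step are illustrative choices of mine, not part of the portlet API, and deleting right after streaming does not guarantee the client actually received the whole file):

import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import javax.portlet.GenericPortlet;
import javax.portlet.PortletException;
import javax.portlet.ResourceRequest;
import javax.portlet.ResourceResponse;

public class ReportPortlet extends GenericPortlet {

    @Override
    public void serveResource(ResourceRequest request, ResourceResponse response)
            throws PortletException, IOException {
        // Hypothetical location of the generated report on the server;
        // validate the parameter in real code to avoid path traversal.
        File report = new File("/tmp/reports/" + request.getParameter("reportId") + ".pdf");

        response.setContentType("application/pdf");
        response.addProperty("Content-Disposition", "attachment; filename=report.pdf");

        // Stream the file to the client through the resource response.
        Files.copy(report.toPath(), response.getPortletOutputStream());

        // Best-effort cleanup once the bytes have been handed to the container.
        report.delete();
    }
}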
As an alternative you might try simply having a cron job that will clean out old files. In this case, make sure to inform users how long they have to successfully download the file.
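A rough sketch of such a cleanup job (the directory and age threshold are assumptions; how it is scheduled, e.g. from a system cron entry or a scheduler inside the portal, is up to you):

import java.io.File;
import java.time.Duration;
import java.time.Instant;

public class OldReportCleaner {

    // Deletes generated report files older than maxAge from the export directory.
    public static void cleanOldReports(File exportDir, Duration maxAge) {
        File[] files = exportDir.listFiles();
        if (files == null) {
            return; // directory missing or not readable
        }
        Instant cutoff = Instant.now().minus(maxAge);
        for (File file : files) {
            if (file.isFile() && Instant.ofEpochMilli(file.lastModified()).isBefore(cutoff)) {
                file.delete();
            }
        }
    }

    public static void main(String[] args) {
        // Example: delete anything older than 24 hours.
        cleanOldReports(new File("/tmp/reports"), Duration.ofHours(24));
    }
}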
I have the BIRT Report Server configured in Tomcat and it works fine when running reports that require an XML data source, but that XML file has to be available on the network in order for the server to find it and run the report. Is there an out-of-the-box configuration in the BIRT server that will prompt the user to upload the XML file directly to the server when they try to run a given report that requires an XML data source? This would be handy for users that have the XML data source stored locally on their C drive, so they would not have to move it to a network server in order for it to be read by BIRT. Thanks in advance.
Paul
There is not an OOTB solution that does what you describe.
Without the OOTB option, the best way to handle this would be to use Actuate's IDAPI. This will give you all the tools to get the file uploaded and added to the iServer. You can expose the IDAPI interface in any number of ways, including on the BIRT report itself or on a custom parameter request page.