SonarQube web service API for delta load?

I see that SonarQube provides a web service API to get all issues, and I will load this data into a database for analysis. Then, I wish to keep my reporting database in sync with the changes happening in the system. Is there a web service API that captures change data?
Overall, I want the reporting DB to be in sync with the system.

The createdAfter parameter of the issues search web service will get you new issues, but there's no analogous updatedAfter parameter. Note that by only looking at added issues, you'll overlook issues that are closed in a new analysis.
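For illustration, a minimal sketch of polling the issues search web service with createdAfter (host, token, and project key are placeholders; the parameter names follow the documented /api/issues/search endpoint, but check them against your server version):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class SonarIssueDeltaPoller {
    public static void main(String[] args) throws Exception {
        // Placeholders: adjust the host, token, and project key for your instance.
        String baseUrl = "https://sonarqube.example.com";
        String token = System.getenv("SONAR_TOKEN"); // a user token
        String createdAfter = "2024-01-01";          // date of the last successful load

        // SonarQube accepts the user token as the Basic-auth username with an empty password.
        String auth = Base64.getEncoder().encodeToString((token + ":").getBytes());

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/api/issues/search"
                        + "?componentKeys=my_project"   // "components" on newer server versions
                        + "&createdAfter=" + createdAfter
                        + "&ps=500&p=1"))               // page through results for large deltas
                .header("Authorization", "Basic " + auth)
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // The JSON body contains paging info and an "issues" array to load into the reporting DB.
        System.out.println(response.body());
    }
}
```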

Related

NiFi: Modify flow without disruption or downtime using Java API

Is there a way to modify NiFi flow dynamically using Java API? The use case is to add a processor to an active data flow (data is flowing through it). The new processor should be added at the beginning of the flow without application disruption or downtime.
In case Java API is not available, please feel free to suggest alternatives. I have already looked at change-nifi-flow-using-rest-api-part-1. Thanks.
Any action you can perform from the UI can also be performed from the REST API; the UI is just making calls to the REST API behind the scenes.
I would suggest opening Chrome's Dev Tools, performing the action you are interested in, and then seeing what requests were made. You can then script these operations however you need.
In addition, if you are trying to deploy flows, then you should be taking advantage of NiFi Registry, which allows you to place a flow under version control. You can then make changes from your local or dev instance and upgrade the flow in production in place without stopping your whole NiFi instance.
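For example, the "add a processor" action from the question corresponds to a single REST call. Here is a minimal sketch, assuming an unsecured local instance and a known process-group id (the exact payload is easiest to confirm by watching the UI's own calls in Dev Tools, as suggested above):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class NiFiAddProcessor {
    public static void main(String[] args) throws Exception {
        // Placeholders: an unsecured local NiFi and the id of the target process group.
        String nifiApi = "http://localhost:8080/nifi-api";
        String processGroupId = "<process-group-id>";

        // Revision version 0 is used when creating a brand-new component.
        String body = """
                {
                  "revision": { "version": 0 },
                  "component": {
                    "type": "org.apache.nifi.processors.standard.GenerateFlowFile",
                    "position": { "x": 100, "y": 100 }
                  }
                }
                """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(nifiApi + "/process-groups/" + processGroupId + "/processors"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // The response carries the new processor's id and revision, which later calls
        // (configure, connect, start) need in order to modify the running flow.
        System.out.println(response.statusCode() + "\n" + response.body());
    }
}
```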

Can Dexie syncable be used with an api server vs directly with a database

I have seen the sample projects on your website for Dexie.Syncable, such as sync-server and sync-client, and they all seem to write to a database directly rather than interacting with a web API. I am looking for a little help on where to get started beyond the examples on the website. The API I am trying to write a gateway for is DreamFactory.
Also, it looks like the version 2 beta has had many improvements to Dexie.Syncable.
I would recommend building a new server project based on either WebSocketSyncServer.js or the GitHub repo of sync-server. However, I cannot give the details on how to call REST APIs instead of working directly against a database or memory. I would suggest using ES2017 async/await since your API calls are asynchronous.
Maybe you could try getting more help on https://github.com/nponiros/sync_server by filing an issue there.

Programmatically generate VersionOne reports for multiple projects

My organization is tracking multiple Scrum projects in VersionOne. Each week, we use the Release Forecasting report for each project to create a management dashboard that indicates the health and expected completion date of each project. I would like to automate this. Do any of the VersionOne APIs allow for the execution of this report and retrieving the image that is generated?
There is not an endpoint specific to Release Forecasting. Nor is there an endpoint to generate the image. However, you can get to the underlying data via the existing API endpoints. For reporting, I recommend query.v1. The closest example is the query for burndown data. You would need to take Scope as the focus of the query, not Timebox.
You might also take a look at VersionOne's Reporting and Analytics. While that is not a coding or API way to get the reports, it might still automate the needs you have.
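If it helps to see the shape of such a call, here is a rough sketch of posting a query.v1 request focused on Scope (the instance URL and token are placeholders, and the selected attribute names are illustrative only, not checked against the meta model):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class V1ScopeQuery {
    public static void main(String[] args) throws Exception {
        // Placeholders: your instance URL and a VersionOne access token.
        String instanceUrl = "https://www1.v1host.com/MyInstance";
        String accessToken = System.getenv("V1_ACCESS_TOKEN");

        // query.v1 accepts a YAML (or JSON) query; the focus here is Scope rather than Timebox.
        // The selected attribute names are illustrative only -- check them against your meta model.
        String query = """
                from: Scope
                select:
                  - Name
                  - BeginDate
                """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(instanceUrl + "/query.v1"))
                .header("Authorization", "Bearer " + accessToken)
                .POST(HttpRequest.BodyPublishers.ofString(query))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // The JSON result can then be charted however you like to reproduce the forecast view.
        System.out.println(response.body());
    }
}
```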
I was able to automate the retrieval of this report, but not through the V1 API. Through careful use of Fiddler and a C# script using WebClient to execute POST requests, it was possible. The resulting code is pretty fragile, though, since it isn't using the API.

ETL of Human Resource data from Taleo

My company needs to migrate data from a Taleo system to a new HR system.
A little research suggests that traditional ETL may not work against the Taleo cloud based system, but I don't know enough about the setup and am trying to learn.
Does anyone have experience migrating HR data from Taleo to another system, and, if so, how did you do it, and was traditional ETL an option?
Thanks
How you access Taleo depends as much on your platform as on theirs.
For example, I'm using Windows: I'm not sure if this is my mistake, but VS2010's Add Service Reference fails against their service.
Taleo has just released a new version that has temporarily broken things for a number of companies.*
Whether your ETL is one-time or continuous, Taleo provides a PDF describing their API, which works as follows for employee records (I'm only grabbing their employee records); other records appear to use the same paradigm.
Employee records have two types of fields: fixed and user-defined. The fixed fields, which I work with in C#, are like simple properties of a class and can be accessed with standard dot notation such as taleoItem.ManagerId. The user-defined fields come in a list of "beans"; for each bean, one looks first at its name, e.g. foreach (var taleoItem in taleoEmployeeBean.flexValues) ... if (taleoItem.fieldName == "Social Club Member") { ... }.
* Currently I'm getting zero of the 50+ flex beans that I normally get and two flex beans that I've never seen before. As can be expected, until Taleo fixes this breakage, all I can do is twiddle my thumbs.
When Taleo works properly, retrieving data generally works like this.
1) Access a fixed URL to get a URL for your company.
2) Authenticate via the URL retrieved from step 1 to get a session token.
3) Use the session token from step 2 to invoke the various Taleo API methods.
Warning: the Taleo API has documentation errors. Also, the test cases will not necessarily work.
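To make the three-step flow concrete: the answer above works in C# against Taleo's WSDL-generated client, but the overall shape is roughly the following Java-flavored sketch. All URLs, paths, and parameter names here are placeholders, not Taleo's actual endpoints; the real ones come from Taleo's API PDF.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class TaleoSessionFlowSketch {
    private static final HttpClient HTTP = HttpClient.newHttpClient();

    public static void main(String[] args) throws Exception {
        // Step 1: call the fixed dispatcher URL to discover your company's service URL.
        // (Placeholder URL; the real one is listed in Taleo's API PDF.)
        String companyServiceUrl = get("https://dispatcher.taleo.invalid/serviceUrl/MYCOMPANY");

        // Step 2: authenticate against the discovered URL to obtain a session token.
        // (Placeholder path and parameters; the real operation is a SOAP login call.)
        String sessionToken = get(companyServiceUrl + "/login?user=apiUser&password=secret");

        // Step 3: pass the session token on every subsequent API call, e.g. fetching employees.
        String employees = get(companyServiceUrl + "/employees?sessionToken=" + sessionToken);
        System.out.println(employees);
    }

    private static String get(String url) throws Exception {
        HttpRequest request = HttpRequest.newBuilder().uri(URI.create(url)).GET().build();
        return HTTP.send(request, HttpResponse.BodyHandlers.ofString()).body();
    }
}
```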
I'm not familiar with Taleo, but according to their website they have features that allow integration via "XML, Web Services, reusable components, and standard APIs". There are many ETL tools on the market that can interface with web services as a source, or you could optionally write your own.
Taleo provides a PDF which describes all the calls that can be made. Basically, Taleo uses SOAP web services for accessing its data.
For a detailed description, visit Taleo Integration in Drupal.

Performance testing application for bottle necks using production data

I have been tasked with looking for a performance testing solution for one of our Java applications running on a Weblogic server. The requirement is to record production requests (both GET and POST including POST data) and then run these requests in a performance test environment with a copy of the production database.
The reasons for using production requests instead of a test script are:
It is a large application with no existing test scripts, so it would be a large amount of work to write scripts to cover the entire application.
Some performance issues only appear when users do a number of actions in a particular order.
To test using actual user interaction with the system, not an estimate of how the users may interact with the system. We all know that users will do things we have not thought of.
I want to be able to fix performance issues and rerun the requests against the fixed code before releasing to production.
I have looked at using JMeter's Access Log Sampler with server access logs; however, the access logs do not contain POST data, and the Access Log Sampler only looks at the request URL, so it cannot simulate users submitting form data.
I have also looked at using the JMeter HTTP Proxy Server; however, this can record the actions of only one user and requires the user to configure their browser to use the proxy. The same limitation exists with Tsung and The Grinder.
I have looked at using Wireshark and tcpreplay, but recording at the packet level is excessive and will not give any useful reports at a request level.
Is there a better way to analyze production performance considering I need to be able to test fixes before releasing to production?
That is going to be a hard ask. I work with Visual Studio Test Edition to load test my applications, and we are only able to "estimate" the users' activity on the site.
It is possible to look at the logs and gather information on the likelihood of certain paths through your app. You can then look at the production database for the likely values entered in any POST requests. From that you will have to build load tests that approximate the usage patterns of your production site.
With any current tools, I don't think it is possible to record and play back actual user interaction.
It is possible to alter your web app so that it records and logs every request and POST against session and datetime. This custom logging could then be used to generate load-test requests against a test website. This would be a serious code change to your existing site and would likely have performance impacts.
That said, I have worked with web apps that do this level of logging and the ability to analyse the exact series of page posts/requests that caused an error is quite valuable to a developer.
So in summary: It is possible, but I have not heard of any off the shelf tools that do it.
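As a rough illustration of the custom logging approach mentioned above, here is a minimal sketch of a servlet filter that records each request's timestamp, method, URI, and session id (the class name and log destination are placeholders; capturing POST bodies would additionally require wrapping the request so the body can be read twice):

```java
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import java.io.IOException;
import java.time.Instant;

// Register via web.xml or @WebFilter("/*"); writes one line per request.
public class RequestCaptureFilter implements Filter {

    @Override
    public void init(FilterConfig config) { }

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest http = (HttpServletRequest) req;
        String sessionId = http.getSession(true).getId();
        String query = http.getQueryString() == null ? "" : "?" + http.getQueryString();

        // Timestamp, method, URI, query string, and session are enough to replay GETs;
        // replaying POSTs would also require buffering and logging the request body.
        System.out.printf("%s %s %s%s session=%s%n",
                Instant.now(), http.getMethod(), http.getRequestURI(), query, sessionId);

        chain.doFilter(req, res);
    }

    @Override
    public void destroy() { }
}
```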
Please check out this whitepaper by Impetus Technologies: http://www.impetus.com/plabs/sandstorm.html
Honestly, I'm not sure the task you're being asked to do is even possible, let alone a good idea. Depending on how complex the application's backend is, and how perfectly you can recreate the state (i.e., all the way down to external SOA services or the time/clock), it may not be possible to make those GET and POST requests reproduce the same behavior.
That said, performance testing against production data is always great, but it usually requires application-specific knowledge that will stress said data. Simply repeating HTTP GETs and POSTs will almost certainly not yield useful results.
Good luck!
I would suggest the following to get the production requests and simulate the accurate workload:
1) Use Coremetrics: Coremetrics provides solutions with which you can learn the application's usage patterns. This would help in coming up with an accurate workload model. This model can then be converted into test scripts and executed against a masked copy of the production database. This will give you accurate results about the application's real-world performance.
2) Another option would be creating a small utility using AOP (aspect-oriented programming) so that it can trace all the requests and the corresponding method calls. This would help in identifying the production usage pattern and, in turn, simulating the workload accurately. AOP frameworks such as AspectJ can be used. This would not require any changes to the code; the instrumentation can be done on the fly. The other benefit is that this can be enabled only for a specific time window and then turned off.
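For the AOP option, a minimal sketch of an AspectJ aspect that times method calls (the pointcut package is a placeholder; with load-time weaving via the AspectJ weaver agent and an aop.xml, no application code changes are needed):

```java
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;

// Declared in META-INF/aop.xml and woven at load time with -javaagent:aspectjweaver.jar,
// so the application's own code does not change.
@Aspect
public class RequestTraceAspect {

    // Placeholder pointcut: trace every public method in the application's packages.
    @Around("execution(public * com.example.myapp..*(..))")
    public Object trace(ProceedingJoinPoint pjp) throws Throwable {
        long start = System.nanoTime();
        try {
            return pjp.proceed();
        } finally {
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            // Swap System.out for your logging framework of choice.
            System.out.println(pjp.getSignature().toShortString() + " took " + elapsedMs + " ms");
        }
    }
}
```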
Regards,
batterywalam
