How does the Jenkins UI show log files? (Spring)

Currently I am developing an application that exposes file contents, especially of log files, from a Spring Boot backend to a React-driven web app.
I really like the log view of Jenkins and asked myself how they handle this. Unfortunately I couldn't find the log viewer in the Jenkins source code.
Can someone please give me a hint how they update the file on the server and client side, or just point me to their source?

A custom log view implementation might not be required.
Spring Boot has a set of special endpoints called the Actuator API.
One of these endpoints is /actuator/logfile, which returns the contents of the Spring Boot log file (note that it only exists when logging.file.name, or logging.file in older Boot versions, is configured, and when the endpoint is exposed via management.endpoints.web.exposure.include):
curl 'http://localhost:8080/actuator/logfile' -i -X GET
"how they are updating the file on server and client side"
Updating the log file on the server is not the problem; a nice view in the web client is the real concern.
I used a Node.js implementation that is ready to use, and you could take some ideas from it:
https://github.com/mthenw/frontail
I think a combination of WebSockets, CSS, correct file stream handling (open, close), and memory management would be necessary to reach your goal.
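As a minimal sketch of the file-tailing half (using server-sent events rather than WebSockets, purely for brevity; the class name, endpoint path, and log file name are made up):

    import java.io.RandomAccessFile;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.RestController;
    import org.springframework.web.servlet.mvc.method.annotation.SseEmitter;

    @RestController
    public class LogTailController {

        private final ExecutorService executor = Executors.newSingleThreadExecutor();

        @GetMapping("/logs/stream")
        public SseEmitter streamLog() {
            SseEmitter emitter = new SseEmitter(0L); // 0 = no timeout
            executor.execute(() -> {
                try (RandomAccessFile file = new RandomAccessFile("app.log", "r")) {
                    long position = file.length(); // start tailing at the end
                    while (!Thread.currentThread().isInterrupted()) {
                        long length = file.length();
                        if (length < position) {
                            position = length; // file was rotated or truncated
                        } else if (length > position) {
                            file.seek(position);
                            String line;
                            while ((line = file.readLine()) != null) {
                                emitter.send(SseEmitter.event().data(line));
                            }
                            position = file.getFilePointer();
                        }
                        Thread.sleep(500); // poll interval
                    }
                } catch (Exception e) {
                    emitter.completeWithError(e);
                }
            });
            return emitter;
        }
    }

On the client side, a plain EventSource pointed at /logs/stream plus a bit of CSS would then cover the viewing part.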

Related

Can I have two or more web processes using Heroku?

I'm trying to implement a reasonably complex architecture on Heroku. I have a Java application that reads/writes data from one source using REST and puts results onto a queue using RabbitMQ. A Django application then reads from this queue, parses the collected data, and saves it to its database. The Django application feeds Android and iOS apps through GraphQL. The problem I have is that Heroku only seems to let me define one web process in my Procfile, where in fact I need two: one for the Java application and one for the Django application. Is there any way I can make this work?
There is no un-hacky, good solution to this, and as the comments stated, it is a bad idea to combine the codebases here.
Following Heroku's ideas, you would split these into separate applications/services that communicate with each other via HTTP or the queue, as illustrated below.
Many addons can be attached to multiple applications if they are shared, so both apps can use the same queue.
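For illustration, each app would then ship its own Procfile with a single web entry (the jar and module names below are hypothetical). The Java app's Procfile:

    web: java -jar target/myapp.jar

and the Django app's Procfile:

    web: gunicorn myproject.wsgi

A shared addon such as the RabbitMQ queue can be attached to the second app with heroku addons:attach <addon-name> --app <django-app>, so both processes talk to the same broker.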

How to get multipart form data in a self-hosted WCF service?

I've been searching for quite a while now and haven't found what I'm looking for. I've self-hosted an HTTP WCF service in a Windows application. Now, in one of my service's methods, I need to receive a file and some form data fields. In similar questions the case is either sending one single file (which is done by streaming the data and then converting it), or WCF is hosted in an environment where ASP.NET compatibility can be turned on to access HttpContext, and all the needed data is then read from HttpContext.Current. Any suggestions?
After a while I found out that someone had already answered my question. For anyone out there facing a similar problem, here is the link to the answer:
https://stackoverflow.com/a/14514351/11797674
This approach also uses a stream input, but not just for a single file: it can also extract multiple files and form data by key. The sample presented in the answer is a little bit old, so I suggest you check the Git repository and follow the sample there. It works like a charm for HTTP WCF services that are self-hosted on an application type that is not a web application, where ASP.NET compatibility mode is not an option (since sessions differ from a web application and no HttpContext is kept by the host application).

Spring Boot File Conversion Microservice as a Cloud Foundry User-provided Service

I have a Spring Boot REST microservice which expects an input file and converts it to other formats using ffmpeg, like so:
ffmpeg -i <INPUT_FILE> -vf 'scale=320:-2' <OUTPUT_FILE>
At the moment I am running this command through a Java ProcessBuilder, referencing the container locations of the input and output files after pushing the microservice to PCF.
The ffmpeg binary and the input file are packaged in the jar file.
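A minimal sketch of such an invocation (the class name and error handling are my own, not from the question):

    import java.io.File;
    import java.io.IOException;

    public class FfmpegConverter {

        // runs: ffmpeg -i <input> -vf scale=320:-2 <output>
        public void scaleTo320(File input, File output)
                throws IOException, InterruptedException {
            Process process = new ProcessBuilder(
                    "ffmpeg", "-i", input.getAbsolutePath(),
                    "-vf", "scale=320:-2",
                    output.getAbsolutePath())
                .inheritIO() // pass ffmpeg's console output through to the app's stdout/stderr
                .start();
            if (process.waitFor() != 0) {
                throw new IllegalStateException("ffmpeg conversion failed");
            }
        }
    }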
I understand I need to use cloud storage like NFS or S3 for these locations, but that is a secondary matter for now.
My idea is to make the microservice a PCF user-provided service, so that bound apps can supply the locations of the input and converted files.
Also, since there are different conversion functions, I have a corresponding endpoint for each conversion function.
All the examples I have seen with respect to microservices have to do with databases, whereby you specify information like the URL and credentials to access the external database, which does not shed any light on what I have in mind.
So I have the following questions:
1) Is it possible to simply convert the running conversion microservice into a PCF user-provided service? I know of the CUPS command, but I am not sure what to supply as parameters, especially since I have several endpoints.
2) How could bound apps call the endpoints of this service and provide the locations of the input and output files?
I would appreciate code snippets if possible.
Thanks
What you're trying to do should be possible, but not quite in the way you describe it.
For starters, you can use a user-provided service to pass details about a microservice between apps. Thus, if you create a conversion microservice, you can pass information about it to consuming apps via a user-provided service.
Where I think you need to change your plans a bit is with the consuming apps. You can't pass the input/output files themselves via a user-provided service, at least not without sharing storage between the two, and you don't want to go down that road: it's limiting, as all apps have to share the storage, and it will hit scaling limits.
Instead, what you'd want to do is put the location of, and the credentials required by, a consuming party into the user-provided service (this is what you're seeing with databases). For example, you'd probably need the hostname or route of your microservice, plus a user token/secret or a username and password (that way you know who's making requests to your service).
Your consuming app could then read the information out of the bound user-provided service and use it to contact your conversion microservice (see the sketch below). At that point, it would be up to the consuming app to understand the API that your microservice exposes, so that it can send the file to be converted and retrieve the processed file. You'd likely send/retrieve files over HTTP, and if I may make a recommendation, you probably want to look at using the claim-check pattern in your service, since conversions can be time-consuming.
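A sketch of the consuming side, assuming the service was created with something like cf cups conversion-service -p '{"uri":"...","token":"..."}' (the service name and credential keys are placeholders, and the org.json library is used for parsing):

    import org.json.JSONArray;
    import org.json.JSONObject;

    public class ConversionServiceConfig {

        // Cloud Foundry injects every bound service into the VCAP_SERVICES env variable;
        // user-provided services appear under the "user-provided" key.
        public static JSONObject conversionCredentials() {
            JSONObject vcap = new JSONObject(System.getenv("VCAP_SERVICES"));
            JSONArray userProvided = vcap.getJSONArray("user-provided");
            for (int i = 0; i < userProvided.length(); i++) {
                JSONObject service = userProvided.getJSONObject(i);
                if ("conversion-service".equals(service.getString("name"))) {
                    return service.getJSONObject("credentials"); // holds uri + token
                }
            }
            throw new IllegalStateException("conversion-service is not bound to this app");
        }
    }

The consuming app would then POST the file to the uri found in these credentials, authenticating with the token.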
Hope that helps!

Streaming log messages of a Spring boot web application

This question asks for general advice on which tools I should use for this task, and possibly for pointers to some related tutorials.
I have a Spring Boot web application which during operation writes log messages into a database. A JavaScript management tool for this REST application is in the making, and one of its functions will be to show the log messages in real time. Meaning: while the user is on the log page, he should see new log messages appear without refreshing the page.
My questions:
What should be used to provide this to the JavaScript client at some endpoint? I'm looking at these Spring Boot starters right now: websocket, redis, amqp. I have not used any of these before.
How should I "catch" the log messages as they are generated inside the application, so I can send them to the client with the chosen solution?
I'm not really looking for a periodic-query type of solution, but rather one where the server pushes the data as it appears.
Any suggestions and code samples are appreciated.
Storing logs in a database is usually not a good option unless you use a database which is capable of handling a lot of write requests, such as Apache Cassandra. Streaming data out of a database is not the most intuitive thing to do, either.
A modern alternative is to use a messaging system such as Apache Kafka to stream logs from producing systems to multiple subscribing systems. There are multiple ways to achieve that. For example, for streaming logs from your Spring Boot app you could use a dedicated log4j appender (see here and an example here). To present the logs in a web browser in real time, you will need another backend service that receives the log records from the Kafka topics and forwards them to the JavaScript web clients via WebSockets, most likely using a publish/subscribe model.
Also, you could consider using server-sent events (SSE) instead of WebSockets. Because you have only a unidirectional message flow (logs are streamed from the backend to the JavaScript client in the browser, but never the other way around), SSE can be a good replacement for WebSockets: WebSockets are more difficult to operate and usually use more resources on the backend. As always, you will need to choose between trade-offs (see here). A sketch of such a forwarding service follows.
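A minimal sketch of the Kafka-to-browser bridge, using SSE and the plain Kafka consumer API (the topic name app-logs, the endpoint path, and the broker address are assumptions):

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import java.util.concurrent.CopyOnWriteArrayList;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.RestController;
    import org.springframework.web.servlet.mvc.method.annotation.SseEmitter;

    @RestController
    public class LogStreamController {

        // one emitter per connected browser
        private final List<SseEmitter> emitters = new CopyOnWriteArrayList<>();

        public LogStreamController() {
            Thread consumerThread = new Thread(this::consumeLogs);
            consumerThread.setDaemon(true);
            consumerThread.start();
        }

        @GetMapping("/logs/stream")
        public SseEmitter subscribe() {
            SseEmitter emitter = new SseEmitter(0L); // 0 = no timeout
            emitter.onCompletion(() -> emitters.remove(emitter));
            emitters.add(emitter);
            return emitter;
        }

        private void consumeLogs() {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("group.id", "log-viewer");
            props.put("key.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");
            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("app-logs"));
                while (true) {
                    for (ConsumerRecord<String, String> record :
                            consumer.poll(Duration.ofSeconds(1))) {
                        for (SseEmitter emitter : emitters) {
                            try {
                                emitter.send(record.value());
                            } catch (Exception e) {
                                emitters.remove(emitter); // client went away
                            }
                        }
                    }
                }
            }
        }
    }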

How to log Dynamics CRM standard SOAP API calls?

Context
We have an on-premise CRM (8.0) application which is integrated with several legacy systems. There are approximately 20 entities which are created/updated/upserted by the legacy systems via the standard SOAP API.
Question
I would like to log all the incoming requests and responses as SOAP/XML for diagnostic reasons. How can I accomplish this?
(Note: I know of the trivial, but not exactly fitting, solution of having workflows for create/update on all affected entities. That does not seem universal enough, plus we ultimately must log the request and response text itself.)
I haven't tried it yet, but I think it should be possible to configure native WCF tracing for the Organization Service. This is really easy to do (it only requires adding some configuration to the web.config file), and you will be able to log every request and response. You can take a look at how to configure it here.
EDIT:
In this link you will be able to see what I've just described in action (it was done for CRM 2011, but it should work in newer versions): link
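For reference, the standard WCF message-logging configuration that this refers to looks roughly like the following in web.config (a sketch; the log path and message limit are placeholders):

    <system.serviceModel>
      <diagnostics>
        <messageLogging logEntireMessage="true"
                        logMessagesAtServiceLevel="true"
                        logMessagesAtTransportLevel="true"
                        maxMessagesToLog="3000" />
      </diagnostics>
    </system.serviceModel>
    <system.diagnostics>
      <sources>
        <!-- routes the logged SOAP messages to an .svclog file, viewable in SvcTraceViewer -->
        <source name="System.ServiceModel.MessageLogging">
          <listeners>
            <add name="messages"
                 type="System.Diagnostics.XmlWriterTraceListener"
                 initializeData="C:\logs\messages.svclog" />
          </listeners>
        </source>
      </sources>
    </system.diagnostics>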
