Syncthing Usage Report - Data syncing?

I have configured the Syncthing usage report server locally and I am able to access its web UI locally. Now I need to get my Syncthing data into the usage report database. 1. Is there any way to automatically sync Syncthing data to the usage report database? 2. If not, how can I fill the report table?

We can set the reporting URL in the Syncthing config, and it will report in every 12 hours or thereabouts.
Reference available here: https://forum.syncthing.net/t/syncthing-usage-report-data-syncing/11202/2
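For reference, the relevant settings live under <options> in Syncthing's config.xml (urURL can also be edited under Actions > Advanced in the web GUI). A minimal sketch, assuming a local ursrv instance listening at http://localhost:8080; the address and report version 3 are assumptions for illustration:
<options>
    <urAccepted>3</urAccepted>
    <urURL>http://localhost:8080/newdata</urURL>
</options>
urAccepted must be a positive report version for anything to be sent at all (0 means undecided, a negative value means declined), and urURL is where the report is posted; ursrv accepts reports on the same /newdata path used by the public https://data.syncthing.net/newdata endpoint. Restart Syncthing after editing the file.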

Related

Azure Blob Storage lifecycle management - send report or log after run

I am considering using Azure Blob Storage's built-in lifecycle management feature for deleting blobs of a certain age.
However, due to a business requirement, it must be possible to generate a report or log statement after each daily execution of the defined ruleset. The report or log must state the number of blob blocks that were affected, e.g. deleted during the run.
I have read through the documentation and Googled to see if others have had similar inquiries, but so far without any luck.
So my question: do any of you know if and how I can get the built-in lifecycle management system to do one of the following after each daily run:
Add a log statement to the storage account containing the Blob storage.
Generate and send a report to an endpoint I define.
If the above can't be done I will have to code the daily deletion job and report generation myself, which surely I can do, but I would like to use the built-in feature if possible.
I summarize the solution below.
If you want to know which blobs are deleted every day, we can configure Diagnostics settings on the storage account. After doing that, we will get logs for the read, write, and delete requests made against the blob service. For more detail, please refer to here and here.
Regarding how to enable it, we can use the PowerShell cmdlet Set-AzStorageServiceLoggingProperty.
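A sketch of that, using the Az storage cmdlets with placeholder resource group and account names:
# Get a storage context for the account that holds the blobs (names are placeholders)
$ctx = (Get-AzStorageAccount -ResourceGroupName "myResourceGroup" -Name "mystorageaccount").Context
# Log read, write, and delete requests against the blob service and keep the logs for 10 days
Set-AzStorageServiceLoggingProperty -ServiceType Blob -LoggingOperations Read,Write,Delete -RetentionDays 10 -Context $ctx
The resulting logs are written to the $logs container of the same storage account, so a small script or query over that container after each daily lifecycle run can count the delete operations.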

run report through form

What is wrong with this?
I am trying to run an 11g report through an 11g form, but I am getting this message: https://i.stack.imgur.com/hHcYY.jpg
The report works correctly from Reports Builder: https://i.stack.imgur.com/9CmS9.jpg
This is the button in the form: https://i.stack.imgur.com/VzKdq.jpg
And this is the code behind the button:
declare
    p_id paramlist;
begin
    -- drop any existing parameter list with this name before recreating it
    p_id := get_parameter_list('tmpdata');
    if not id_null(p_id) then
        destroy_parameter_list(p_id);
    end if;
    p_id := create_parameter_list('tmpdata');
    add_parameter(p_id, 'paramform', text_parameter, 'no');
    add_parameter(p_id, 'p_1', text_parameter, :block2.t1);
    -- note: the parameter list is built but never passed anywhere;
    -- everything is sent to rwservlet through the URL instead
    web.show_document('http://mohamed-pc:7001/reports/rwservlet?report=D:\test\pharmacy\med_by_company.rdf&userid=pharmacy/pharmacy#orcl&destype=cache&paramform=htmlcss');
end;
The crash no longer occurs when Reports Server tracing is switched off. This may be due to the size of the trace file and there being insufficient disk space, memory, or CPU available to create it.
The problem does not reproduce when using rwrun / rwrun.sh because rwrun does not run the request via the Reports Server, and therefore no tracing takes place.
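For comparison, the same report can be run directly with rwrun, bypassing the Reports Server entirely. A sketch using the file path and credentials from the question (the output file name and the @orcl connect string are assumptions):
rwrun report=D:\test\pharmacy\med_by_company.rdf userid=pharmacy/pharmacy@orcl destype=file desname=D:\test\med_by_company.pdf desformat=pdf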
It should be noted that Reports Server tracing can have a major impact on performance. It is recommended that Reports Server tracing is only enabled when diagnostic information is required to troubleshoot a problem with a report, an error, or a crash. Reports Server tracing should not be enabled by default, especially in a production environment.
See also: Running a Report via rwrun Performs Much Faster Than Using rwservlet or rwclient.
Switch off Reports Server tracing by commenting out the XML tag relating to tracing in the Reports Server conf file.
Change <trace traceOpts="trace_all"/>
to <!--trace traceOpts="trace_all"/-->
Stop and start the Reports Server for the change to take effect.
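On a Fusion Middleware 11g installation the standalone Reports Server is typically managed with opmnctl; a sketch, where the ias-component name is a placeholder you can read from opmnctl status:
opmnctl status
opmnctl stopproc ias-component=ReportsServer_myhost_asinst1
opmnctl startproc ias-component=ReportsServer_myhost_asinst1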
Reports Server configuration files:
The configuration settings for the Reports Server component of Oracle Reports Services are stored in the XML files rwserver.conf and rwbuilder.conf, located in the directory ORACLE_HOME\reports\conf.

How to use snapshot and caching functions without actually storing credentials in SSRS

I have developed a few test reports on my local machine. I came across two mechanisms called Snapshot and Caching. I am trying to implement them in my reports, but every time I try to set up caching it throws the error "credentials need to be stored."
Can we use caching and snapshots with Windows credentials? If so, what is the approach?
My local machine details:
Server name: (local)
Authentication: Windows
User name and password: greyed out
Report server URL: satish-pc/reportserver
DB: AdventureWorks
Scheduled snapshots or caching plans mean the report is executed on an automated basis and the results are stored for easier/faster retrieval later. As the executions are automated and unattended, they need connections with stored credentials, since there is no user sitting at the computer at run time to punch in credentials. So, in order to use snapshots or scheduled caches, you will need to create a data source that has credentials stored in it. In Report Manager, you can edit the report's data source on the Report Properties page, or the shared data source's connection info on its own properties page.

Storing ssrs reports in a file that can be called immediately

Hi Fellow SSRS Developers,
I have a scenario that I'm trying to tend to but need to know if what I want to do is even possible.
I have 4 reports that I would like to have run and then store the rendered output in a file on a server. The reason for this is that the response time on the reports is a bit long, and I've done everything I can in SQL to speed it up.
What I want to have happen is that when a user clicks on the report name, instead of rendering the report on the spot, it simply serves the copy that is already stored so that it loads in lightning-quick time.
Has anyone ever done this with SSRS and is it even possible?
Thanks,
Other than running reports on demand, there are two specific options: Running from a Cached report and running from a Snapshot.
You can see details on all of this in Setting Report Processing Properties.
Caching
From Books Online:
To enhance performance, you can specify a report (and data) to be
cached temporarily when a user runs the report. The cached copy is
subsequently available to other users who access the same report. With
this approach, if ten users open the report, only the first request
results in report processing. The report is subsequently cached, and
the remaining nine users view the cached report.
So here you can see that it is a specific user action that causes a stored report to be created.
See Report Caching in Reporting Services.
Snapshots
From Books Online:
A report snapshot is a report that contains layout information and
data that is retrieved at a specific point in time. You can run a
report as a report snapshot to prevent the report from being run at
arbitrary times (for example, during a scheduled backup). A report
snapshot is usually created and subsequently refreshed on a schedule,
allowing you to time exactly when report and data processing will
occur. If a report is based on queries that take a long time to run,
or on queries that use data from a data source that you prefer no one
access during certain hours, you should run the report as a snapshot.
Here you can see that these are generally set up on a regular schedule, i.e. independent of user activity.
See Creating, Modifying, and Deleting Snapshots in Report History.
In this case it seems like Snapshots might be your best option, as they give you more control over when the stored report is created. The main issue with Snapshots is that they need either stored credentials or an unattended execution account, so they might not be possible in all cases.

Publishing SQL Data Tools 2012 project: Forces into Single User Mode

I have a CLR Project that I'm trying to publish using Visual Studio. I had to change the project to a SQL Data Tools project, and now it's not publishing. Each time I try, I get a timeout error. When I take it step-by-step, I find this line of code hangs on my server.
IF EXISTS (
    SELECT 1
    FROM [master].[dbo].[sysdatabases]
    WHERE [name] = N'fwDrawings')
BEGIN
    ALTER DATABASE [fwDrawings]
    SET READ_COMMITTED_SNAPSHOT OFF;
END
Basically, I know it's trying to force the database into single-user mode when I try to publish. It's just my staging server and not a production server, but this is still a problem. I can't keep kicking everyone off the server and switching the database to single-user mode every time I want to update the CLR while I'm testing its functionality. And I don't want to wait for a maintenance cycle or downtime to promote it to production. Is there a way around this?
Presumably you have READ_COMMITTED_SNAPSHOT turned on for your database.
If this is the case, you need to change your Database project settings to match. Check "Read committed snapshot" transaction isolation, within the Operational tab in Database Settings for the project.
For me, this prevented the publish from timing out, i.e. I can now publish successfully.
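To confirm what the target database is actually using before changing the project setting, a quick check (a sketch; fwDrawings is the database from the question) is:
-- 1 means READ_COMMITTED_SNAPSHOT is ON, so the project's "Read committed snapshot" setting should be checked
SELECT name, is_read_committed_snapshot_on
FROM sys.databases
WHERE name = N'fwDrawings';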
For a safer way to deploy to a server that's in use, try using a schema comparison instead.
