Get all days from a given month & year with an Azure Synapse Data Flow expression

I need help getting the dates of a given month and year. I am using an Azure Synapse Data Flow expression and have tried different things, but I could not get it to work. There is no other similar question for Data Flow here.
I also tried the following, but it returns a null value:
toDate(concat(concat({PeriodeMåned},'-'),{PeriodeRegnskapsår}),'yyyyMMdd')
I have the following format (example screenshot, captioned "f.eks" / "e.g.", not reproduced here).
Looking for a list of all the dates in the following format (screenshot "How I need it", not reproduced here).
Any help will be greatly appreciated.
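For what it's worth, a possible fix, assuming {PeriodeMåned} holds the month number and {PeriodeRegnskapsår} the year: concat builds a string like 3-2023, so the format passed to toDate has to match that string (M-yyyy), not yyyyMMdd. Once the first day of the month parses, a derived column could expand it to every day of the month with mapLoop. Treat this as an untested sketch — it assumes the Data Flow functions mapLoop, lastDayOfMonth, dayOfMonth and addDays behave as in the current function reference:

```
/* first day of the given month/year; 'M-yyyy' matches the "3-2023" string that concat builds */
toDate(concat(concat(toString({PeriodeMåned}), '-'), toString({PeriodeRegnskapsår})), 'M-yyyy')

/* array of all days in that month: loop once per day and add the offset to day 1 */
mapLoop(
    dayOfMonth(lastDayOfMonth(toDate(concat(concat(toString({PeriodeMåned}), '-'), toString({PeriodeRegnskapsår})), 'M-yyyy'))),
    addDays(toDate(concat(concat(toString({PeriodeMåned}), '-'), toString({PeriodeRegnskapsår})), 'M-yyyy'), #index - 1)
)
```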

Related

Looking for multi-database data processing with Spring Batch

I have a situation where I need to call 3 databases and create a CSV.
I have created a batch step that reads the data from my first database.
This gives around 10000 records.
Now, from each of these records, I need to take the id and use it to fetch data from the other data sources. I have not been able to find the best solution.
Any help in finding a solution is appreciated.
I tried two steps, one for each data source, but I am not sure how to pass the ids to the next step (we are talking about 10000 ids).
Is it possible to connect to all 3 databases in the same step? I am new to Spring Batch, so I do not have a full grasp of all the concepts.
You can do the second call to fetch the details of each item in an item processor. This is a common pattern and is described in the Common Batch Patterns section of the reference documentation.
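The item-processor approach can be sketched in plain Java as follows. ItemProcessor here mirrors the contract of Spring Batch's org.springframework.batch.item.ItemProcessor; SourceRecord, EnrichedRecord, and the Map-based lookups are hypothetical stand-ins for the real row type and the DAO/JdbcTemplate calls against databases 2 and 3:

```java
import java.util.Map;

// Mirrors Spring Batch's ItemProcessor<I, O> contract.
interface ItemProcessor<I, O> {
    O process(I item) throws Exception;
}

// Stand-in for one row read from database 1 by the step's reader.
class SourceRecord {
    final long id;
    SourceRecord(long id) { this.id = id; }
}

// Stand-in for the fully assembled row that the writer turns into a CSV line.
class EnrichedRecord {
    final long id;
    final String detailFromDb2;
    final String detailFromDb3;
    EnrichedRecord(long id, String d2, String d3) {
        this.id = id;
        this.detailFromDb2 = d2;
        this.detailFromDb3 = d3;
    }
}

class EnrichingProcessor implements ItemProcessor<SourceRecord, EnrichedRecord> {
    // Stand-ins for lookups against the second and third database.
    private final Map<Long, String> db2;
    private final Map<Long, String> db3;

    EnrichingProcessor(Map<Long, String> db2, Map<Long, String> db3) {
        this.db2 = db2;
        this.db3 = db3;
    }

    @Override
    public EnrichedRecord process(SourceRecord item) {
        // The processor sees one item at a time: look up the id in the other
        // two sources and hand the combined record on to the writer.
        return new EnrichedRecord(item.id, db2.get(item.id), db3.get(item.id));
    }
}
```

In a real job the reader streams the 10000 rows from database 1 and the writer appends each EnrichedRecord to the CSV; because enrichment happens per item inside one step, the full list of ids never has to be passed between steps.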

How to look up S&P 500 constituents history: added and removed dates, etc.

I am trying to get a historical list of the S&P 500 underlying stock mix: all the tickers with the dates they were added to the S&P 500 index, the dates they were removed, and the constituent mix for each period throughout the years. I did some searching but did not have any luck.
If anyone can provide some good search keywords or suggest a place to look, it would be appreciated; this is something very specific.
I currently use backtrader to work on some data, so if there is a systematic way to get the data, please let me know as well.
Many thanks.
You can access this data systematically in QuantRocket, via data provider Sharadar:
https://www.quantrocket.com/data/?filter=sharadar

How can I scrape a table into Google Sheets?

I was wondering how to get the table from the following webpage. I imagine it is possible using the IMPORTHTML function, but I was not able to identify the path; I think it is masked or not accessible.
Does someone know how to get the correct path, or how to import this table another way?
Thank you
I don't think Google Sheets can handle it. As stated by #wp78de, a lot of scripts are running in the background. But you can access the data directly in JSON format here:
Data

Showing multiple metrics with conditions and calculations in the Visualize tool in Kibana

I want to create a table in the Visualize tool in Kibana where I show several metrics with conditions and calculations. I have created a generic example in Excel (screenshot not reproduced here). I know the basics of how to produce the first two columns, but the other ones are harder. I tried adding JSON input with another Count and a script, but unfortunately I can't get it to work. Any ideas?
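No answer is recorded above, but a conditional metric of this kind is normally expressed at the Elasticsearch level (which Kibana's aggregations are built on) as a filter sub-aggregation wrapping the metric. A sketch, where the field names status and amount are made up for illustration:

```json
{
  "aggs": {
    "only_status_a": {
      "filter": { "term": { "status": "A" } },
      "aggs": {
        "total_amount": { "sum": { "field": "amount" } }
      }
    }
  }
}
```

The outer filter aggregation restricts the documents, and the inner sum is the "metric with a condition"; one such block per column would give a table like the Excel example.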

Elasticsearch to index RDBMS data

These are three simple questions to which it was surprisingly hard to find definite answers.
Does Elasticsearch support indexing data in RDBMS tables (Oracle/SQL Server/Informix) out of the box?
If yes, can you please point me to documentation on how to do it?
If not, what are the alternative ways (plugins like Rivers, now deprecated) with a good reputation?
I'm surprised there isn't a solid answer for this yet, so here's the solution: Logstash gives us the ability to push data from an RDBMS directly into Elasticsearch.
Here's a link to a tutorial that tells you how to go about it. Briefly (all details in the first link), you simply need a JDBC driver for the relational database you'll be using (Postgres, MySQL, etc.) and a config file specifying the relational database as your input and Elasticsearch as your output. You can also specify a cron-style schedule, which lets you keep updating at regular intervals.
Here's the article that explains the configuration and gets you started (see Example 2): https://www.elastic.co/blog/logstash-jdbc-input-plugin
Here's the article that tells you how to configure the schedule: https://www.elastic.co/guide/en/logstash/current/plugins-inputs-jdbc.html#_scheduling
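A minimal pipeline config along the lines the answer describes might look like this. The jdbc input and elasticsearch output are the real Logstash plugin names; the driver path, connection details, table, and index name are placeholders to adapt:

```conf
input {
  jdbc {
    # Driver jar and class for your RDBMS (PostgreSQL shown as an example)
    jdbc_driver_library => "/path/to/postgresql-driver.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_user => "user"
    jdbc_password => "secret"
    # Query whose rows become Elasticsearch documents
    statement => "SELECT * FROM my_table"
    # cron-style schedule: re-run every 5 minutes to pick up new data
    schedule => "*/5 * * * *"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "my_table"
  }
}
```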
