How can I make a copy of Data from a google sheet with importrange static? - google-sheets-formula

I have created a number of large results spreadsheets that pull data from a master spreadsheet using IMPORTRANGE and are updated throughout the year.
My problem is that at the end of the year I want to store this year's values from my results spreadsheets statically, say in copies, but then have the dynamic spreadsheets continue on into the next year with new values being populated.
Worst case, I can copy the spreadsheets and change the IMPORTRANGE links, but there are lots of sheets and links, so I'm wondering if there is a good way to simply make a copy of a sheet and then make it static, so it no longer pulls data from the master sheet but keeps its values.
I tried downloading the Google Sheets as Excel files and re-uploading them, but they break when re-uploaded to Google Drive.
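One manual option is to copy each sheet and use Paste special > Values only. If there are many sheets, an Apps Script sketch can do the same thing in one pass; `freezeAllSheets` is an illustrative name, not a built-in, and you would run it on the copy, not the original:

```javascript
// Sketch (Google Apps Script): overwrite each sheet's formulas with their
// current values, so IMPORTRANGE results become static. Run on a copy of
// the spreadsheet, since this is not reversible.
function freezeAllSheets() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  ss.getSheets().forEach(function (sheet) {
    var range = sheet.getDataRange();
    // copyTo with contentsOnly pastes values only, replacing the formulas.
    range.copyTo(range, { contentsOnly: true });
  });
}
```

After running it, the sheets keep their values but no longer reference the master spreadsheet.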

Related

Uploading a csv file to snowflake using fivetran

I want to upload CSV files from Google Sheets to a Snowflake database. I would like to know if there is any option in Fivetran such that only upserts (modified rows only) of these CSV files are synced to the Snowflake table?
You can definitely use Fivetran to upload Google Sheets data into Snowflake: https://fivetran.com/docs/files/google-sheets
Fivetran by default only updates modified rows (adds/deletes/updates), so it will indeed upsert if that Google Sheet is modified. I believe by default Fivetran will only 'soft-delete' deletions from the source data, so if you delete a row from your Google Sheet it will remain in Snowflake but will have a _fivetran_deleted column flag. I would test this explicitly if it matters to you.
I'm not sure what you mean by 'only upserts'. No deletions of data once it's stored in Snowflake from Google Sheets? If so, Fivetran may already do exactly what you want out of the box. Fivetran will definitely insert new rows and update existing ones.
I don't think you can do this with Fivetran, but you can two-way sync a Google Sheet with Snowflake using Wax in real time (e.g. as soon as the Sheet is edited, the update is sent to Snowflake).
Disclaimer: I made this.

How to copy and paste bulk data from Excel to Html Table and then save to Laravel

I want to copy and paste a bulk of data, say 5-6 rows, from an Excel sheet to the front end at once in the same format, and then hit save so that the data is saved to the database using Laravel. I am not concerned about how to save the data (I can do that), but I am concerned about how to bulk copy and paste multiple rows and columns to the front end.
How can I achieve this?
Let's consider this as the Excel sheet table:
Link Here
I want to copy all the rows and paste them into my front end so that I can save the data. How can I achieve this kind of bulk copy-pasting? Any suggestions, references, or help will be appreciated.
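One common approach (a sketch, not from the original thread): when you copy cells in Excel, the clipboard holds tab-separated text, so a paste handler can split it into rows and cells, render them into an HTML table, and then POST them to Laravel. The function and element names here are illustrative:

```javascript
// Sketch: parse clipboard text copied from Excel (tab-separated columns,
// newline-separated rows) into a 2-D array.
function parseExcelClipboard(text) {
  return text
    .replace(/\r/g, '')        // normalize Windows line endings
    .split('\n')
    .filter(function (line) { return line.length > 0; })
    .map(function (line) { return line.split('\t'); });
}

// In the browser, wire it to a paste event (illustrative):
// document.querySelector('#paste-area').addEventListener('paste', function (e) {
//   e.preventDefault();
//   var rows = parseExcelClipboard(e.clipboardData.getData('text/plain'));
//   // render rows into a <table>, then send them to Laravel with fetch().
// });
```

For example, pasting two Excel rows with two columns each yields `[["a","b"],["c","d"]]`, which is straightforward to render and submit.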

Google Script Performance and Speed Comparison with Google Sheet Formula

I have one thousand Google Form Responses spreadsheets. These are students' answer sheets. I built a spreadsheet and pull data (timestamps and scores) for each student by using Google Sheets formulas (INDEX MATCH and IMPORTDATA). Each student has different pages. But it takes too much time and sometimes causes some source sheets to become unresponsive (I think because of heavy formula usage). My questions:
Is it possible to do the same thing (pulling data that matches a student's name from one thousand spreadsheets) by using Google Apps Script?
If it is possible, which one (Google Sheets with formulas or Google Apps Script) performs better?
Based on your answers I will decide whether to begin learning Google Apps Script.
Thanks in advance.
Is it possible to do the same thing (pulling data that matches a student's name from one thousand spreadsheets) by using Google Apps Script?
Yes, it's possible.
NOTE: Bear in mind that Google Sheets has a 5 million cell limit, so if your data exceeds this limit, you should consider using another data repository.
If it is possible, which one (Google Sheets with formulas or Google Apps Script) performs better?
Since most Google Sheets formulas are recalculated every time a change is made in the spreadsheet that holds them, it's very likely that Google Apps Script will perform better when using Google Sheets as a database management system, because you have more control over when the database transactions are made.
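As a rough sketch of that approach, a script can read each source spreadsheet once with a single batched `getValues()` call and filter in memory, instead of leaving hundreds of live formulas recalculating. The function name, spreadsheet IDs, and column layout here are all illustrative assumptions:

```javascript
// Sketch (Google Apps Script): pull one student's rows from many response
// spreadsheets, one batched read per spreadsheet.
function collectStudentRows(studentName, spreadsheetIds) {
  var results = [];
  spreadsheetIds.forEach(function (id) {
    var values = SpreadsheetApp.openById(id)
      .getSheets()[0]
      .getDataRange()
      .getValues();                  // single batched read
    values.forEach(function (row) {
      if (row[1] === studentName) {  // assumes the name is in column B
        results.push(row);           // e.g. [timestamp, name, score]
      }
    });
  });
  return results;
}
```

The matching rows can then be written back to the summary spreadsheet with one `setValues()` call, so the transaction happens only when you choose to run the script.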
Related
Measurement of execution time of built-in functions for Spreadsheet
Why do we use SpreadsheetApp.flush();?
Both do the same thing. Both will be as intensive on your computer. My advice would be to upgrade your PC!

Sending keys to Tableau with a script

I am currently using Tableau Desktop and saving my workbooks as .twbx files (Tableau data extract + Tableau workbook), and I need to refresh my Tableau data extract (not the view) every 4 hours. I found out that there are plenty of ways to do it with Tableau Online, but none with Tableau Desktop. What I am thinking of would be to send keys from a PowerShell script to Tableau to automatically refresh the .tde (which is connected to a PostgreSQL database).
Thank you for helping.
Use the Tableau command-line tool to refresh or append to an extract as often as desired, or call it from a script:
http://onlinehelp.tableau.com/current/pro/online/mac/en-us/help.html#extracting_TDE.html
Or, if necessary, write your own extract utility using the Tableau Data Extract API.
You will probably want to save your extract data source separately from the workbooks that reference it. That is, save the extract to a .tde file and then save your workbook as a .twb file (not a .twbx), unless you have other reasons to package your workbook. That way you can refresh or replace the extract without touching the workbook(s) that reference it.
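Rather than sending keystrokes, you can schedule the command-line refresh with the Windows task scheduler. A rough sketch follows; the tableau.exe path and the refreshextract arguments are illustrative, so check the command-line help page linked above for the exact flags your version supports:

```shell
# Sketch: register a task that runs the extract refresh every 4 hours.
TABLEAU="C:/Program Files/Tableau/Tableau Desktop/bin/tableau.exe"
REFRESH_CMD="\"$TABLEAU\" refreshextract --datasource \"My Extract\""

# Echoed here so the sketch is safe to run anywhere; drop the echo to
# actually register the scheduled task on Windows.
echo schtasks /Create /SC HOURLY /MO 4 /TN RefreshTDE /TR "$REFRESH_CMD"
```

This avoids fragile SendKeys automation entirely, since nothing depends on the Tableau window having focus.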

Widget with search facility in wxwidgets

I was wondering if there are any nice widgets in wxWidgets with search ability. I mean searching for data in large tables, like data in wxGrid.
Thanks in advance
It would be slow and inefficient in several ways to store all of a large dataset in wxGrid, and then to search through wxGrid.
It would be better to keep the dataset in a database and use the database engine to search through it. wxGrid need only store the data that is visible in the GUI.
Here is some high-level pseudo code of how this works:
1. Load the data into a store. The store should probably be a database, but it might be a vector or any STL container. Even a text file could be made to work!
2. Set the current viewable row to a sensible value. (I am assuming your data is arranged in rows.)
3. Load the rows including and around the current row into wxGrid.
4. The user enters a search term.
5. Send the search request to the data store, which returns the row containing the target.
6. Set the current row to the row containing the target.
7. Load the rows including and around the current row into wxGrid.
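The steps above can be sketched in plain C++, covering only the store-and-search parts; wiring the visible window into wxGrid (e.g. via a wxGridTableBase subclass) is left out, and the class and method names are illustrative:

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <utility>
#include <vector>

// Illustrative data store: the full dataset lives here, not in wxGrid.
class DataStore {
public:
    void load(std::vector<std::string> rows) { rows_ = std::move(rows); }

    // Search the store (not the grid) for the first row containing `term`;
    // returns the row index, or -1 if not found.
    long search(const std::string& term) const {
        for (std::size_t i = 0; i < rows_.size(); ++i)
            if (rows_[i].find(term) != std::string::npos)
                return static_cast<long>(i);
        return -1;
    }

    // Return the window of rows around `current` that the grid should show.
    std::vector<std::string> window(long current, long radius) const {
        long lo = std::max<long>(0, current - radius);
        long hi = std::min<long>(static_cast<long>(rows_.size()),
                                 current + radius + 1);
        return std::vector<std::string>(rows_.begin() + lo, rows_.begin() + hi);
    }

private:
    std::vector<std::string> rows_;
};
```

When the user searches, call `search()` on the store, set the current row to the result, and refresh wxGrid from `window()`; the grid never holds more than the visible slice.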
