Shiny DT::renderDataTable - datatable

Suppose the columns of a data table are: unique-ID, name, salary, position.
I display the table in a Shiny application using DT::renderDataTable (DT::dataTableOutput).
I would like to click on a row of the output and display other data for the person belonging to that ID in another output.
What is the solution?
In short, how do I extract the unique ID from a clicked row?

You can use the _rows_selected input that DT adds for rendered tables; the DT documentation lists the available table-interaction inputs and includes several live examples.
This is a simple example of a plot that updates with the rows selected in the table:
library(shiny)
library(DT)

ui <- fluidPage(
  DT::dataTableOutput("test_table"),
  plotOutput("test_plot")
)

server <- function(input, output, session) {
  output$test_table <- DT::renderDataTable({
    mtcars
  })
  output$test_plot <- renderPlot({
    # Row numbers of the rows currently selected in "test_table"
    s <- input$test_table_rows_selected
    if (!is.null(s)) {
      plot(mtcars[s, "disp"])
    }
  })
}

shinyApp(ui, server)
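To answer the original question about the unique ID: the values in input$test_table_rows_selected are row numbers of the data frame you rendered, so you can index your ID column with them. A minimal sketch, assuming the rendered data frame and its ID column are called my_data and unique_ID (placeholder names, not from the question), with a hypothetical output$person_details for displaying the result:
output$person_details <- renderPrint({
  s <- input$test_table_rows_selected
  if (length(s)) {
    selected_ids <- my_data[s, "unique_ID"]        # map clicked row numbers to unique IDs
    my_data[my_data$unique_ID %in% selected_ids, ] # look up the rest of that person's data
  }
})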

Related

Shiny app - aggregate data set by selection_filter and create new variables

I am quite new to R and have been trying to find a solution to my problem for weeks. I hope someone can help me.
1. I want to develop a Shiny app in a dashboard where the user can select values via selection_filter (e.g. the value "40-49 years" from the variable "age group" and the value "female" from "sex"). Based on these selections, columns (e.g. columns x, y, and z) from the original dataset will be aggregated. I already wrote a function using aggregate().
2. Based on the aggregated columns, new values shall be calculated (e.g. d = (x - y)/(z/2)).
3. The aggregated columns and the newly calculated values shall be displayed in a table to the user.
The function from step 1:
aggreg.function <- function(a, b, c) {
  agg.data <- aggregate(cbind(x, y, z), shared_Cervix, sum,
                        subset = c(!AgeGroup %in% a & !Sex %in% b & !Edition %in% c))
  # Calculate new values
  agg.data$d <- agg.data$x + agg.data$y
  agg.data$f <- (agg.data$x + agg.data$y) / (agg.data$z / 2)
  View(agg.data)
  agg.data  # return the aggregated data so the reactive below can use it
}
user_data <- reactive({
  aggreg.function(input$AgeGroup, input$Sex, input$Edition)
})
EDIT
Thanks for the recommendations. I changed my code, but now I am struggling a bit with adding new columns. In total, I want to insert 17 new columns based on the filtered table (data_step2()). Is there a way to insert multiple columns at the same time? In my example: is it possible to combine data_step3 and data_step4?
ui <- fluidPage(
  selectInput("Age", "Age:", sort(unique(Complete_test$Age))),
  selectInput("Race", "Race:", sort(unique(Complete_test$Race))),
  selectInput("Stage", "Stage:", sort(unique(Complete_test$Stage))),
  selectInput("Grade", "Grade:", sort(unique(Complete_test$Grade))),
  selectInput("Edition", "Edition:", sort(unique(Complete$Edition))),
  DT::dataTableOutput("filtered.result")
)

server <- function(input, output) {
  data_step1 <- reactive({
    Complete %>%
      filter(Age %in% input$Age & Stage %in% input$Stage & Grade %in% input$Grade &
               Race %in% input$Race & Edition %in% input$Edition)
  })
  data_step2 <- reactive({
    data_step1() %>%
      group_by(Age, Stage, Grade, Race, Edition, Year) %>%
      summarise(across(everything(), sum))
  })
  # Is it possible to combine data_step3 and data_step4?
  data_step3 <- reactive({
    data_step2() %>% mutate(xy = x + y)
  })
  data_step4 <- reactive({
    data_step3() %>% mutate(w = xy / (x2))
  })
  output$filtered.result <- DT::renderDataTable({
    data_step4()
  })
}

shinyApp(ui, server)
Here's something you might be able to do for step 1 of your question, although it's hard to tell what your data might look like and what your end result should be. I'm assuming that you want to allow a user of your dashboard to select the AgeGroup and Sex variables to view data, so in the UI of the application two selectInput() functions are used to provide that functionality. On the server side, two reactive statements filter the data as the user changes the inputs: in each one, an input is required (req()) and then the dataset is filtered by that input. Notice that in the second reactive statement "data_step1()" is used instead of "data_step1"; this keeps the chain reactive, so the result continues to update as the user changes the inputs.
For steps 2 and 3, use the "data_step2()" dataset to add the new columns, and then a function such as renderDataTable to display the output on your dashboard.
library(shiny)
library(dplyr)

ui <- fluidPage(
  selectInput("AgeGroup", "AgeGroup:", sort(unique(Data$AgeGroup))),
  selectInput("Sex", "Sex:", sort(unique(Data$Sex)))
)

server <- function(input, output, session) {
  data_step1 <- reactive({
    req(input$AgeGroup)                           # Require input
    Data %>% filter(AgeGroup %in% input$AgeGroup) # Filter the full dataset by the AgeGroup input
  })
  data_step2 <- reactive({
    req(input$Sex)                                # Require input
    data_step1() %>% filter(Sex %in% input$Sex)   # Filter the step-1 result by the Sex input
  })
  # Do next steps with the filtered data_step2() dataset...
}

# Run the application
shinyApp(ui = ui, server = server)
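Regarding the EDIT in the question: a single mutate() call can create several new columns at once, and later columns may refer to ones created earlier in the same call, so data_step3 and data_step4 can be collapsed into one reactive; the same pattern extends to all 17 new columns. A sketch using the column names from the question (x2 is kept exactly as written there):
data_step3 <- reactive({
  data_step2() %>%
    mutate(
      xy = x + y,     # first new column
      w  = xy / (x2)  # later columns can use columns defined just above
      # ...the remaining new columns can be added here in the same call
    )
})
output$filtered.result would then render data_step3() directly.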

Bring the value of A1 from all active sheets into a Master report

Each sheet represents a client profile, with the name in F1 and the email address in F3. We add a new sheet for every new client. I need a Master Report that shows the F1 values of all sheets (the clients' names) in column A and the F3 values of all sheets (their email addresses) in column B. The report needs to be updated whenever a new client is added.
Thank you.
You can use this custom function:
function getF1F3Data() {
  // Every sheet except the master sheet (assumed here to be named 'main')
  var sheets = SpreadsheetApp.getActiveSpreadsheet().getSheets()
      .filter(sheet => sheet.getSheetName() != 'main');
  // One row per client sheet: [name from F1, email address from F3]
  return sheets.map(sheet => [sheet.getRange("F1").getValue(), sheet.getRange("F3").getValue()]);
}
Matches all sample data put into Sheet2 and Sheet3.
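To use it, save the function in the spreadsheet's script editor (Extensions > Apps Script) and enter =getF1F3Data() in a cell of the master sheet (e.g. A2); because the function returns a two-dimensional array, the names and email addresses spill into two adjacent columns. Note that the filter above skips a sheet named 'main', so change that string to whatever your master sheet is actually called.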

Is there a way to create a data validation script for multiple rows

I am setting up a spreadsheet in Google Sheets, and I need data validation so that a list of items in column E pulls data from columns K:N.
I have tried dragging the data validation cell down, but it just copies the information from the previous row.
https://docs.google.com/spreadsheets/d/1q6laBJgtsZ8famEV9tbQI0MsRmwE8Lbs0HB-2AuTt7I/edit?usp=sharing
Thank you
I played around with this idea this morning, and you're welcome to use this simple script.
Before you run it, first select the range to validate and then the range that holds the validation values. Play with it before you use it on a working spreadsheet.
function makeValidation() {
  var ss = SpreadsheetApp.getActive();
  var sh = ss.getActiveSheet();
  var rgA = sh.getActiveRangeList().getRanges();
  if (rgA.length == 2) {
    // First selected range: cells to validate; second selected range: allowed values
    rgA[0].setDataValidation(SpreadsheetApp.newDataValidation().requireValueInRange(rgA[1]).build());
  } else {
    SpreadsheetApp.getUi().alert('Invalid RangeList. Only two ranges at a time please. The range to validate first and the range of values second.');
  }
}

Unable to access documents containing predicted labels

I am using RTextTools to run a model I previously trained. I am unable to extract each document and its predicted label. I have tried looking at analytics@document_summary and inspecting the container, but I need some help getting the approximately 25K documents and their predicted labels into a data frame for subsequent processing. Here is the relevant snippet of code:
preddata <- read.csv("prediction_data.csv")  # placeholder file name; the original just said "read from csv"
predsize <- nrow(preddata)
preddata$topic <- rep(NA, predsize)
predmatrix <- create_matrix(preddata, originalMatrix = dtMatrix)
predcontainer <- create_container(predmatrix, preddata$topic, testSize = 1:predsize, virgin = TRUE)
predresults <- classify_models(predcontainer, models)
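One commonly used way to pair each document with its predicted label is to build the analytics object and bind its document_summary slot, or the classify_models() output itself, back onto the original data. This is only a sketch, not verified against the 25K-document dataset; the exact column names depend on which algorithms are in models:
analytics <- create_analytics(predcontainer, predresults)
pred_summary <- analytics@document_summary    # one row per document, label/probability columns per algorithm
labeled_docs <- cbind(preddata, pred_summary) # attach the predictions to the original documents
head(labeled_docs)
Since classify_models() already returns a data frame with one row per document, cbind(preddata, predresults) is another option.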

Writing arbitrary R objects to SQLite database

I'm trying to store large list objects created in R in an SQLite database via RSQLite. Since these list objects contain several 2D and 3D matrices, I'd like to store them as individual entries. I read that serializing them and storing them as blobs does the trick.
The problem, however, is that my code does not store each blob as a single row; instead it stores each separate byte as its own row. Here is my code:
library(RSQLite)

out1 <- serialize(model1, NULL)
out2 <- serialize(model2, NULL)
out3 <- serialize(model3, NULL)

model4 <- serialize(rnorm(10), NULL)
model5 <- serialize(rnorm(20), NULL)
model6 <- serialize(rnorm(30), NULL)

db <- dbConnect(SQLite(), dbname = "Test.sqlite")
dbGetQuery(conn = db,
           "CREATE TABLE IF NOT EXISTS models
            (_id INTEGER PRIMARY KEY AUTOINCREMENT,
             model BLOB)")

test4 <- data.frame(g = I(model4))
test5 <- data.frame(g = I(model5))
test6 <- data.frame(g = I(model6))

dbGetPreparedQuery(db, "INSERT INTO models (model) values (:g)", bind.data = test4)
dbGetPreparedQuery(db, "INSERT INTO models (model) values (:g)", bind.data = test5)
dbGetPreparedQuery(db, "INSERT INTO models (model) values (:g)", bind.data = test6)

dbListTables(db)
p1 <- dbGetQuery(db, 'select * from models')
Also, while the writing process works fine in this case, it is incredibly slow with files larger than 1000kb...
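The byte-per-row behaviour appears to come from data.frame(g = I(model4)): a raw vector of length n yields an n-row data frame, so every byte becomes its own row. Wrapping each serialized object in a list keeps it as a single BLOB per row. A sketch using the current DBI interface (dbExecute() with params; dbGetPreparedQuery() is deprecated in newer RSQLite versions):
# one row per model: the bound value is a list holding a single raw vector
dbExecute(db, "INSERT INTO models (model) VALUES (:g)",
          params = list(g = list(model4)))

# reading back: the BLOB column comes out as a list of raw vectors
p1 <- dbGetQuery(db, "SELECT * FROM models")
restored <- unserialize(p1$model[[1]])

# wrapping many inserts in one transaction usually speeds up bulk writes considerably
dbWithTransaction(db, {
  for (m in list(model4, model5, model6)) {
    dbExecute(db, "INSERT INTO models (model) VALUES (:g)", params = list(g = list(m)))
  }
})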
