How can I send data from Mnesia via WebSocket?

I have a Mnesia DB with a table artists:
(gw#gw)227> lookup:for_test().
{atomic,["Baltic Baroque","Anna Luca","Karel Boehlee Trio",
"Bill Evans","Dino Saluzzi and Anja Lechner",
"Bill Evans Trio with Stan Getz","Duke Pearson",
"The John Butler Trio"]}
(gw#gw)228>
I want to send this list to the client over a WebSocket in YAWS, but how can I do it? I've racked my brain ... and nothing works. Please help with any info.
Best regards!

You have to convert this list of strings to binaries and then pass it across the WebSocket. By the way, what kind of data handling are you doing at the receiving end? That is, do you want to send this as JSON or just as comma-separated values?

Yes ... I solved my problem!
My YAWS handler:
handle_message({text, <<"q2">>}) ->
    Var555 = unicode:characters_to_binary(my_json4:handle()),
    {reply, {text, <<Var555/binary>>}};
And my Mnesia select plus conversion to JSON, my_json4.erl:
convert_to_json(Lines) ->
    Data = [{obj,
             [{art_id, Line#tmp.artist_id},
              {art, unicode:characters_to_binary(Line#tmp.artist)},
              {alb_id, Line#tmp.album_id},
              {alb, unicode:characters_to_binary(Line#tmp.album)},
              {path, unicode:characters_to_binary(Line#tmp.albumpath)},
              {image, unicode:characters_to_binary(Line#tmp.image)},
              {tracks, [unicode:characters_to_binary(X) || X <- Line#tmp.tracks]}]}
            || Line <- Lines],
    JsonData = {obj, [{data, Data}]},
    rfc4627:encode(JsonData).
handle() ->
    QHandle = qlc:q([X ||
                        X <- mnesia:table(artists),
                        X#artists.artist_id == 2]),
    Records = do(qlc:q([{tmp, X#artists.artist_id, X#artists.artist,
                         A#albums.album_id, A#albums.album, A#albums.albumpath,
                         A#albums.image, A#albums.tracklist} ||
                            X <- QHandle,
                            A <- mnesia:table(albums),
                            A#albums.artist_id == X#artists.artist_id])),
    Json = convert_to_json(Records),
    Json.
do(Q) ->
    F = fun() ->
            qlc:e(Q)
        end,
    {atomic, Value} = mnesia:transaction(F),
    Value.

Related

Edit a datatable in shiny throwing an error: Can't subset with `[` using an object of class NULL

I am trying to edit a data table (DT::datatable) on the server side, and I am using a reactive data table which is rendered on the front end. Now I want to edit the data table and retrieve the information from the edited data table. Here is a reproducible example of what I am trying to achieve, as mentioned in the comments in the reprex:
library(shiny)
library(DT)
library(tidyverse)

d <- iris

ui <- fluidPage(
  dataTableOutput("table1"),
  dataTableOutput("table2")
)

server <- function(input, output, session) {
  # This is the main table I would want to display in its full context
  output$table1 <- renderDataTable(
    datatable(d)
  )
  get_row <- reactive({
    d %>% slice(input$table1_rows_selected)
  })
  # Here the table with row selected from table 1 is displayed
  output$table2 <- renderDataTable({
    datatable(get_row(),
              editable = TRUE)
  })
  # Now as the cell gets edited in table 2, i want the edited value to show
  # and make the last column values = NA
  proxy <- dataTableProxy("table2")
  observeEvent(eventExpr = input$table2_cell_edit, handlerExpr = {
    x <- isolate(get_row())
    info <- input$table2_cell_edit
    i <- info$row
    j <- info$column
    v <- info$value
    x[i, j] <<- DT::coerceValue(v, x[i, j])
    x <- x %>% mutate(Species = NA)
    replaceData(proxy, x, resetPaging = FALSE)
  })
}

shinyApp(ui = ui, server = server)
I am getting an error Error in <<-: object 'x' not found. I am not sure where I am wrong.
The solution came down to your input$table2_cell_edit names. It uses col, not column. The error created an empty x that couldn't accept your assignment of the new value. I used a simpler assignment of the new value.
In the future, use the print() function to print out variables in your app to figure out what's being passed or not passed downstream. That's how I figured out this error.
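For example, a quick way to see what the edit event actually contains (just the observer, dropped into the server from the app above while debugging):
observeEvent(input$table2_cell_edit, {
  info <- input$table2_cell_edit
  print(info)         # the edit event sent by DT
  print(names(info))  # reveals 'row', 'col', 'value' -- not 'column'
})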
library(shiny)
library(DT)
library(tidyverse)

d <- iris

ui <- fluidPage(
  dataTableOutput("table1"),
  dataTableOutput("table2")
)

server <- function(input, output, session) {
  # This is the main table I would want to display in its full context
  output$table1 <- renderDataTable(
    datatable(d)
  )
  get_row <- reactive({
    req(input$table1_rows_selected)
    d %>% slice(input$table1_rows_selected)
  })
  # Here the table with row selected from table 1 is displayed
  output$table2 <- renderDataTable({
    req(get_row)
    datatable(get_row(),
              editable = TRUE)
  })
  # Now as the cell gets edited in table 2, i want the edited value to show
  # and make the last column values = NA
  proxy <- dataTableProxy("table2")
  observeEvent(eventExpr = input$table2_cell_edit, handlerExpr = {
    x <- isolate(get_row())
    info <- input$table2_cell_edit
    i <- info$row
    ### info uses 'col' not 'column'
    j <- info$col
    v <- info$value
    ### used a base R subset assignment
    x[i, j] <- v
    x <- x %>% mutate(Species = NA)
    replaceData(proxy, x, resetPaging = FALSE)
  })
}

shinyApp(ui = ui, server = server)

How to connect leaflet map clicks (events) with plot creation in a shiny app

Hello, I am creating an environmental Shiny app in which I want to use a leaflet map to create some simple plots based on the openair package (https://rpubs.com/NateByers/Openair).
aq_measurements() general form:
AQ <- aq_measurements(country = "country", city = "city", location = "location", parameter = "pollutant choice", date_from = "YYYY-MM-DD", date_to = "YYYY-MM-DD")
All parameters are available in the locations dataframe.
worldmet() general form:
met <- importNOAA(code = "12345-12345", year = YYYY:YYYY)
The NOAA code is available in the locations dataframe.
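For instance, a concrete call might look like this (the country, city, station code and dates here are placeholders only, not values from my app):
library(ropenaq)   # aq_measurements()
library(worldmet)  # importNOAA()
# placeholder arguments -- substitute real values from the locations dataframe
AQ  <- aq_measurements(country = "MN", city = "Ulaanbaatar", location = "100 ail",
                       parameter = "pm10", date_from = "2015-09-01", date_to = "2018-02-01")
met <- importNOAA(code = "445820-99999", year = 2015:2018)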
Below I create a sample of my initial data frame:
location = c("100 ail","16th and Whitmore","40AB01 - ANTWERPEN")
lastUpdated = c("2018-02-01 09:30:00", "2018-02-01 03:00:00", "2017-03-07 10:00:00")
firstUpdated = c("2015-09-01 00:00:00","2016-03-06 19:00:00","2016-11-22 15:00:00")
pm25=c("FALSE","FALSE","FALSE")
pm10=c("TRUE","FALSE","FALSE")
no2=c("TRUE","FALSE","FALSE")
latitude=c(47.932907,41.322470,36.809700)
longitude=c(106.92139000,-95.93799000
,-107.65170000)
df = data.frame(location, lastUpdated, firstUpdated,latitude,longitude,pm25,pm10,no2)
As a general idea, I want to be able to click on a certain location on the map based on this dataframe. Then I have one selectInput() and two dateInput()s. The two dateInput()s should take as inputs df$firstUpdated and df$lastUpdated respectively. The selectInput() should take as inputs the pollutants that exist in the df based on their "TRUE"/"FALSE" values. Then the plots should be created. All of this should be triggered by clicking on the map.
Up to now I have not been able to achieve this, so to help you understand, I connected the selectInput() and the dateInput() with input$loc, which is a selectInput() with locations in the first tab; I will not need this once I find the solution.
library(shiny)
library(leaflet)
library(plotly)
library(shinythemes)
library(htmltools)
library(DT)
library(utilr)
library(openair)
library(plotly)
library(dplyr)
library(ggplot2)
library(gissr)
library(ropenaq)
library(worldmet)
# Define UI for application that draws a histogram
ui = navbarPage("ROPENAQ",
tabPanel("CREATE DATAFRAME",
sidebarLayout(
# Sidebar panel for inputs ----
sidebarPanel(
wellPanel(
uiOutput("loc"),
helpText("Choose a Location to create the dataframe.")
)
),
mainPanel(
)
)
),
tabPanel("LEAFLET MAP",
leafletOutput("map"),
wellPanel(
uiOutput("dt"),
uiOutput("dt2"),
helpText("Choose a start and end date for the dataframe creation. Select up to 2 dates")
),
"Select your Pollutant",
uiOutput("pollutant"),
helpText("While all pollutants are listed here, not all pollutants are measured at all locations and all times.
Results may not be available; this will be corrected in further revisions of the app. Please refer to the measurement availability
in the 'popup' on the map."),
hr(),
fluidRow(column(8, plotOutput("tim")),
column(4,plotOutput("polv"))),
hr(),
fluidRow(column(4, plotOutput("win")),
column(8,plotOutput("cal"))),
hr(),
fluidRow(column(12, plotOutput("ser"))
)
)
)
#server.r
# load data
# veh_data_full <- readRDS("veh_data_full.RDS")
# veh_data_time_var_type <- readRDS("veh_data_time_var_type.RDS")
df$location <- gsub(" ", "+", df$location)
server = function(input, output, session) {
  output$pollutant <- renderUI({
    selectInput("pollutant", label = h4("Choose Pollutant"),
                choices = colnames(df[, 6:8]),
                selected = 1)
  })
  # Stores the value of the pollutant selection to pass to the openAQ request
  ###################################
  # output$OALpollutant <- renderUI({OALpollutant})
  ##################################
  # create the map, using dataframe 'locations' which is polled daily (using ropenaq)
  # MOD TO CONSIDER: add all available measurements to the popup - true/false for each pollutant, and dates of operation.
  output$map <- renderLeaflet({
    leaflet(subset(df, (df[, input$pollutant] == "TRUE"))) %>% addTiles() %>%
      addMarkers(lng = subset(df, (df[, input$pollutant] == "TRUE"))$longitude,
                 lat = subset(df, (df[, input$pollutant] == "TRUE"))$latitude,
                 popup = paste("Location:", subset(df, (df[, input$pollutant] == "TRUE"))$location, "<br>",
                               "Pollutant:", input$pollutant, "<br>",
                               "First Update:", subset(df, (df[, input$pollutant] == "TRUE"))$firstUpdated, "<br>",
                               "Last Update:", subset(df, (df[, input$pollutant] == "TRUE"))$lastUpdated
                 ))
  })
  # Process Tab
  OAL_site <- reactive({
    req(input$map_marker_click)
    location %>%
      filter(latitude == input$map_marker_click$lat,
             longitude == input$map_marker_click$lng)
    ###########
    # call functions for data retrieval and processing. Might be best to put all data request
    # functions into a separate single function. Need to:
    #   call importNOAA() to retrieve meteorology data into temporary data frame
    #   call aq_measurements() to retrieve air quality into a temporary data frame
    #   merge meteorology and air quality datasets into one working dataset for computations; temporary
    #   meteorology and air quality datasets to be removed.
    #   call openAir() functions to create plots from merged file. Pass output to a dashboard to assemble
    #   into appealing output.
    #   produce output, either as direct download, or as an emailable PDF.
    #   delete all temporary files and reset for next run.
  })
  # fun
  output$loc <- renderUI({
    selectInput("loc", label = h4("Choose location"),
                choices = df$location, selected = 1
    )
  })
  output$dt <- renderUI({
    dateInput('date',
              label = 'First Available Date',
              value = subset(df$firstUpdated, (df[, 1] == input$loc))
    )
  })
  output$dt2 <- renderUI({
    dateInput('date2',
              label = 'Last available Date',
              value = subset(df$lastUpdated, (df[, 1] == input$loc))
    )
  })
  rt <- reactive({
    AQ <- aq_measurements(location = input$loc, date_from = input$dt,
                          date_to = input$dt2, parameter = input$pollutant)
    met <- importNOAA(year = 2014:2018)
    colnames(AQ)[9] <- "date"
    merged <- merge(AQ, met, by = "date")
    # date output -- reports user-selected state & stop dates in UI
    merged$location <- gsub(" ", "+", merged$location)
    merged
  })
  # DT
  output$tim = renderPlot({
    timeVariation(rt(), pollutant = "value")
  })
}
shinyApp(ui = ui, server = server)
The part of my code where I believe input$MAPID_click should be applied is:
output$map <- renderLeaflet({
  leaflet(subset(locations, (locations[, input$pollutant] == "TRUE"))) %>% addTiles() %>%
    addMarkers(lng = subset(locations, (locations[, input$pollutant] == "TRUE"))$longitude,
               lat = subset(locations, (locations[, input$pollutant] == "TRUE"))$latitude,
               popup = paste("Location:", subset(locations, (locations[, input$pollutant] == "TRUE"))$location, "<br>",
                             "Pollutant:", input$pollutant, "<br>",
                             "First Update:", subset(locations, (locations[, input$pollutant] == "TRUE"))$firstUpdated, "<br>",
                             "Last Update:", subset(locations, (locations[, input$pollutant] == "TRUE"))$lastUpdated
               ))
})
output$dt <- renderUI({
  dateInput('date',
            label = 'First Available Date',
            value = subset(locations$firstUpdated, (locations[, 1] == input$loc))
  )
})
output$dt2 <- renderUI({
  dateInput('date2',
            label = 'Last available Date',
            value = subset(locations$lastUpdated, (locations[, 1] == input$loc))
  )
})
rt <- reactive({
  AQ <- aq_measurements(location = input$loc, date_from = input$dt, date_to = input$dt2)
  met <- importNOAA(year = 2014:2018)
  colnames(AQ)[9] <- "date"
  merged <- merge(AQ, met, by = "date")
  # date output -- reports user-selected state & stop dates in UI
  merged$location <- gsub(" ", "+", merged$location)
  merged
})
# DT
output$tim = renderPlot({
  timeVariation(rt(), pollutant = "value")
})
Here is a minimal example: you click on your marker and you get a plot. (A possible extension for the pollutant and date inputs follows after the code.)
library(shiny)
library(leaflet)
library(dplyr)
library(ggplot2)

ui = fluidPage(
  leafletOutput("map"),
  textOutput("temp"),
  plotOutput('tim')
)

# server.r
# df$location <- gsub(" ", "+", df$location)
server = function(input, output, session) {
  output$map <- renderLeaflet({
    leaflet(df) %>% addTiles() %>% addMarkers(lng = ~longitude, lat = ~latitude)
  })
  output$temp <- renderPrint({
    input$map_marker_click$lng
  })
  output$tim <- renderPlot({
    req(input$map_marker_click)  # wait for a click before plotting
    temp <- df %>% filter(longitude == input$map_marker_click$lng)
    # timeVariation(temp, pollutant = "value")
    print(ggplot(data = temp, aes(longitude, latitude)) + geom_point())
  })
}
shinyApp(ui = ui, server = server)
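If you also want the click to fill in the pollutant and date inputs from your question, an untested sketch along these lines could go inside the server function (it assumes your df and the input IDs 'pollutant', 'date' and 'date2', and uses the session argument):
# When a marker is clicked, look up that row of df and push its available
# pollutants and date range into the existing inputs.
observeEvent(input$map_marker_click, {
  click <- input$map_marker_click
  row <- df %>% filter(latitude == click$lat, longitude == click$lng)
  req(nrow(row) == 1)
  polls <- c("pm25", "pm10", "no2")
  available <- polls[unlist(lapply(row[polls], as.character)) == "TRUE"]
  updateSelectInput(session, "pollutant", choices = available)
  updateDateInput(session, "date",  value = as.Date(row$firstUpdated))
  updateDateInput(session, "date2", value = as.Date(row$lastUpdated))
})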

Issue in Facebook Replies download from post comments

I am trying to download public comments and replies from a public Facebook post by page.
My code was working until 5 Feb 2018; now it shows the error below for the replies.
Error in data.frame(from_id = json$from$id, from_name = json$from$name, :
arguments imply differing number of rows: 0, 1
Called from: data.frame(from_id = json$from$id, from_name = json$from$name,
message = ifelse(!is.null(json$message), json$message, NA),
created_time = json$created_time, likes_count = json$like_count,
comments_count = json$comment_count, id = json$id, stringsAsFactors = F)
Please refer to the code below that I am using.
data_fun = function(II, JJ, page, my_oauth) {
  test <- list()
  test.reply <- list()
  for (i in II:length(page$id)) {
    test[[i]] <- getPost(post = page$id[i], token = my_oauth, n = 100000,
                         comments = TRUE, likes = FALSE)
    if (nrow(test[[i]][["comments"]]) > 0) {
      write.csv(test[[i]], file = paste0(page$from_name[2], "_comments_", i, ".csv"),
                row.names = F)
      for (j in JJ:length(test[[i]]$comments$id)) {
        test.reply[[j]] <- getCommentReplies(comment_id = test[[i]]$comments$id[j],
                                             token = my_oauth, n = 100000,
                                             replies = TRUE, likes = FALSE)
        if (nrow(test.reply[[j]][["replies"]]) > 0) {
          write.csv(test.reply[[j]],
                    file = paste0(page$from_name[2], "_replies_", i, "_and_", j, ".csv"),
                    row.names = F)
        }
      }
    }
  }
  Sys.sleep(10)
}
Thanks for your support in advance.
I had the very same problem, as Facebook changed the API rules at the end of January. If you update your package with devtools from Pablo Barbera's GitHub, it should work for you (see the sketch below).
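For reference, the install step usually looks something like this (the exact repository path and subdirectory are from memory and may have changed):
# install.packages("devtools")  # if devtools is not installed yet
library(devtools)
# assumed repo layout: the package sits in a subdirectory of the Rfacebook repo
install_github("pablobarbera/Rfacebook", subdir = "Rfacebook")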
I have amended my code (a little) and it works fine now for replies to comments. There is one frustrating thing, though: Facebook doesn't appear to allow one to extract the user name. I have a pool of data already, so I am now using that to train and predict gender.
If you have any questions and want to make contact - drop me an email at 'robert.chestnutt2#mail.dcu.ie'
By the way - it may not be an issue for you, but I have had challenges in the past writing the Rfacebook output to a csv. Saving the output as an .RData file maintains the form a lot better (a sketch is below).
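A rough sketch of what I mean, reusing the test and test.reply lists from your code above (the file names are just placeholders):
# Save the raw nested lists instead of flattening them to CSV;
# load() / readRDS() restores them with their structure intact.
save(test, test.reply, file = "facebook_comments_and_replies.RData")
# or, one object per file:
saveRDS(test.reply, file = "facebook_replies.rds")
replies_back <- readRDS("facebook_replies.rds")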

GA API Filter Failure: Invalid value 'ga:userGender == female' for filters parameter

I tried to add a filter to extract data only from female users, but encountered the following error:
Invalid value 'ga:userGender == female' for filters parameter.
This is my code:
query.list <- Init(start.date = "2015-10-09", end.date = "2016-10-09",
                   dimensions = "ga:userAgeBracket",
                   metrics = "ga:users,ga:pageviews,ga:sessions",
                   filter = "ga:userGender == female",
                   max.results = 10000, table.id = "ga:xxxxxxxx")
ga.query <- QueryBuilder(query.list)
ga.data <- GetReportData(ga.query, oauth_token)
I do not understand what I did wrong -- I thought my code was correct.
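One thing worth checking (a guess based on the Core Reporting API filter syntax, not a confirmed fix for this exact setup): the filters expression does not allow spaces around the operator, so the value would be written as ga:userGender==female:
# same query, with the filter operator written without surrounding spaces
query.list <- Init(start.date = "2015-10-09", end.date = "2016-10-09",
                   dimensions = "ga:userAgeBracket",
                   metrics = "ga:users,ga:pageviews,ga:sessions",
                   filter = "ga:userGender==female",
                   max.results = 10000, table.id = "ga:xxxxxxxx")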

How to delete a dataframe with BigR?

I have a dataframe created as such (from the BigR tutorials):
airfile <- system.file("extdata", "airline.zip", package="bigr")
airfile <- unzip(airfile, exdir = tempdir())
airR <- read.csv(airfile, stringsAsFactors=F)
# Upload the data to the BigInsights server. This may take 15-20 seconds
air <- as.bigr.frame(airR)
air <- bigr.persist(air, dataSource = "DEL",
                    dataPath = "/user/bigr/examples/airline_demo.csv",
                    header = T, delimiter = ",", useMapReduce = F)
Is it possible to delete the dataframe using the bigr library? If so, how?
Ah, I wasn't looking in the right place. I was looking for an operation on the dataset, but a file system operation is required:
bigr.rmfs("/user/bigr/examples/airline_demo.csv", force = F, recursive = T)
