WebRTC ICE Candidate could not be added error - websocket

I am getting this error when I call addIceCandidate:
var candidate = new RTCIceCandidate({sdpMLineIndex: message.label,
                                     candidate: message.candidate});
pc.addIceCandidate(candidate);
My candidate is formed correctly, yet I am still getting this error.
Error is:
Failed to execute 'addIceCandidate' on 'RTCPeerConnection': The ICE candidate could not be added.
My issue is that when creating the offer with pc.createOffer(setLocalAndSendMessage, onSignalingError, sdpConstraints); nothing happens: neither setLocalAndSendMessage nor onSignalingError is called, which is why the candidate cannot be added later.
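For reference, a minimal sketch of the legacy callback flow this snippet appears to use; the logging, the sendMessage helper, and the remote-description guard are illustrative assumptions, not a confirmed fix:

pc.createOffer(function (offer) {          // setLocalAndSendMessage should land here
    console.log('offer created', offer);
    pc.setLocalDescription(offer);
    sendMessage(offer);                    // send the offer over the signaling websocket
}, function (error) {                      // onSignalingError should land here
    console.error('createOffer failed', error);
}, sdpConstraints);

// Only add remote candidates after the remote description has been set,
// otherwise addIceCandidate can fail with the error above.
if (pc.remoteDescription) {
    pc.addIceCandidate(new RTCIceCandidate({
        sdpMLineIndex: message.label,
        candidate: message.candidate
    }));
}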

Related

AutoMLSearch with EvalML returning an error

I am getting the following error message while trying to run AutoMLSearch with EvalML:
"All pipelines in the current AutoML batch produced a score of np.nan on the primary objective <evalml.objectives.standard_metrics.LogLossBinary object at 0x7f74defbe790>."
I tried the following solution to rectify this, but it did not help:
https://github.com/alteryx/evalml/issues/3154
Any suggestions?
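A score of np.nan on every pipeline usually means the objective could not be evaluated at all, so it is worth sanity-checking the inputs before re-running the search. A minimal sketch, assuming a binary target and hypothetical file and column names:

import numpy as np
import pandas as pd
from evalml.automl import AutoMLSearch

# Hypothetical files; substitute your own training data.
X = pd.read_csv("features.csv")
y = pd.read_csv("target.csv")["label"]

# Check for missing/infinite feature values and an unusable target,
# which are common reasons for every pipeline scoring NaN.
print("missing feature values:", int(X.isna().sum().sum()))
print("infinite numeric values:", int(np.isinf(X.select_dtypes("number")).sum().sum()))
print(y.value_counts(dropna=False))

automl = AutoMLSearch(X_train=X, y_train=y, problem_type="binary")
automl.search()
print(automl.rankings.head())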

Compilation problem while running the sdm package in RStudio

I am getting this error when compiling:
Error in FUN(X[[i]], ...): trying to get slot "presence" from an object of a basic class ("NULL") with no slots
How can I solve this?
You should check the formula to make sure you are using the same species name in both calls, i.e.
sdmdata <- sdmData(species ~ ., train, test, predictors, bg, ...)
If you then write a different name in the model formula, for example
sdmmodel <- sdm(specie ~ ., data = sdmdata, methods = c("glm", "brt"))
you will get the error you described. I solved a similar problem that way.

How do I set up a State Store for a Transformer

I'm trying to create a Transformer, and running into problems with the initialization of its StateStore. I looked at the example in How to register a stateless processor (that seems to require a StateStore as well)?
and it makes sense, but I'm trying something different:
KeyValueBytesStoreSupplier groupToKVStore_supplier =
        Stores.persistentKeyValueStore( state_store_name );
StoreBuilder< KeyValueStore< G, KeyValue< K, V > > > groupToKVStore_builder =
        Stores.keyValueStoreBuilder( groupToKVStore_supplier, Gserde, KVserde );
stream_builder.addStateStore( groupToKVStore_builder );
My intention is to use a String as the State Store key and a KeyValue as the State Store value. Is the formulation above correct? I'm asking because when the stream containing my Transformer is starting up, it throws an exception that says:
Caused by: org.apache.kafka.streams.errors.TopologyBuilderException: Invalid topology building: Processor KSTREAM-TRANSFORM-0000000001 has no access to StateStore state_store_1582785598
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.getStateStore(ProcessorContextImpl.java:72)
at com.ui.streaming.processors.sort.WindowedTimeSorter.init(WindowedTimeSorter.java:135)
at org.apache.kafka.streams.kstream.internals.KStreamTransform$KStreamTransformProcessor.init(KStreamTransform.java:51)
at org.apache.kafka.streams.processor.internals.ProcessorNode$2.run(ProcessorNode.java:54)
at org.apache.kafka.streams.processor.internals.StreamsMetricsImpl.measureLatencyNs(StreamsMetricsImpl.java:208)
at org.apache.kafka.streams.processor.internals.ProcessorNode.init(ProcessorNode.java:10
Per Matthias' suggestion, I added a StateStore name argument to the transform invocation in my Stream, and that appears to get us past the error shown above. However, we then get the following exception:
ERROR stream-thread [A.Completely.Different.appID-b04af4b4-fdbb-4353-9aa5-6d71f7c22f9e-StreamThread-1] Failed to process stream task 0_1 due to the following error: (org.apache.kafka.streams.processor.internals.AssignedStreamsTasks:105)
java.lang.IllegalStateException: This should not happen as timestamp() should only be called while a record is processed
at org.apache.kafka.streams.processor.internals.AbstractProcessorContext.timestamp(AbstractProcessorContext.java:153)
at org.apache.kafka.streams.state.internals.StoreChangeLogger.logChange(StoreChangeLogger.java:59)
at org.apache.kafka.streams.state.internals.ChangeLoggingKeyValueBytesStore.put(ChangeLoggingKeyValueBytesStore.java:69)
at org.apache.kafka.streams.state.internals.ChangeLoggingKeyValueBytesStore.put(ChangeLoggingKeyValueBytesStore.java:29)
at org.apache.kafka.streams.state.internals.InnerMeteredKeyValueStore.put(InnerMeteredKeyValueStore.java:198)
at org.apache.kafka.streams.state.internals.MeteredKeyValueBytesStore.put(MeteredKeyValueBytesStore.java:117)
at com.ui.streaming.processors.sort.WindowedTimeSorter.transform(WindowedTimeSorter.java:167)
at com.ui.streaming.processors.sort.WindowedTimeSorter.transform(WindowedTimeSorter.java:1)
at org.apache.kafka.streams.kstream.internals.KStreamTransform$KStreamTransformProcessor.process(KStreamTransform.java:56)
Alas, things are still not quite right. First off, my Transformer's init method is being called three times; it should only be called once, right? Second, I'm getting a runtime error in my Transformer's transform method the first time it tries to store something into the StateStore:
INFO stream-thread [A.Completely.Different.appID-7dc67466-20f4-4e6c-8a69-bc0710a42f3c-StreamThread-1] Shutdown complete (org.apache.kafka.streams.processor.internals.StreamThread:1124)
Exception in thread "A.Completely.Different.appID-7dc67466-20f4-4e6c-8a69-bc0710a42f3c-StreamThread-1" java.lang.IllegalStateException: This should not happen as timestamp() should only be called while a record is processed
at org.apache.kafka.streams.processor.internals.AbstractProcessorContext.timestamp(AbstractProcessorContext.java:153)
at org.apache.kafka.streams.state.internals.StoreChangeLogger.logChange(StoreChangeLogger.java:59)
at org.apache.kafka.streams.state.internals.ChangeLoggingKeyValueBytesStore.put(ChangeLoggingKeyValueBytesStore.java:69)
at org.apache.kafka.streams.state.internals.ChangeLoggingKeyValueBytesStore.put(ChangeLoggingKeyValueBytesStore.java:29)
at org.apache.kafka.streams.state.internals.InnerMeteredKeyValueStore.put(InnerMeteredKeyValueStore.java:198)
at org.apache.kafka.streams.state.internals.MeteredKeyValueBytesStore.put(MeteredKeyValueBytesStore.java:117)
at com.ui.streaming.processors.sort.WindowedTimeSorter.transform(WindowedTimeSorter.java:155)
Just adding the store to the topology is not sufficient. You additionally need to connect the store to the transformer by passing the store name into transform():
stream.transform(..., state_store_name);
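To make the wiring concrete, a minimal sketch under the question's own names (the generic types, serdes, topic names, and a WindowedTimeSorter constructor taking the store name are assumptions, not confirmed code):

StreamsBuilder stream_builder = new StreamsBuilder();

// Build and register the store with the topology, as in the question.
StoreBuilder<KeyValueStore<G, KeyValue<K, V>>> groupToKVStore_builder =
        Stores.keyValueStoreBuilder(
                Stores.persistentKeyValueStore(state_store_name), Gserde, KVserde);
stream_builder.addStateStore(groupToKVStore_builder);

// Connect the store to the transformer by passing its name to transform();
// without this second step the processor "has no access to StateStore"
// and the TopologyBuilderException above is thrown.
KStream<K, V> stream = stream_builder.stream(input_topic);
stream.transform(() -> new WindowedTimeSorter<>(state_store_name), state_store_name)
        .to(output_topic);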
Update:
For the second exception, I assume that you don't return a new object when TransformerSupplier#get() is called, but you return the same object each time. As the "supplier pattern" suggests, you need to create a new object each time #get() is called (otherwise, a supplier would not make sense and it would be possible to hand in a single object directly). Compare the FAQ: https://docs.confluent.io/current/streams/faq.html#why-do-i-get-an-illegalstateexception-when-accessing-record-metadata
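As a hedged illustration of that supplier pattern (the supplier class below is a stand-in and only assumes WindowedTimeSorter takes the store name in its constructor):

public class WindowedTimeSorterSupplier<K, V>
        implements TransformerSupplier<K, V, KeyValue<K, V>> {

    private final String stateStoreName;

    public WindowedTimeSorterSupplier(String stateStoreName) {
        this.stateStoreName = stateStoreName;
    }

    @Override
    public Transformer<K, V, KeyValue<K, V>> get() {
        // Return a new Transformer on every call. Sharing one instance across
        // tasks is what leads to timestamp() being called while no record is
        // being processed, i.e. the IllegalStateException shown above.
        return new WindowedTimeSorter<>(stateStoreName);
    }
}

Used as: stream.transform(new WindowedTimeSorterSupplier<>(state_store_name), state_store_name);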

OS X Swift Compiler Error - Segmentation Fault

I am making some variable declarations in an NSViewController custom class. The declarations are:
var filterSettings: Dictionary<String, String> = ["Location": "All", "Status": "All", "PPRDate": "All", "Project Manager": "All"]
let locationFilterSettings: Set = ["All", "Newcastle", "Sydney", "ACT & Southern NSW", "Western Sydney", "Grafton"]
let statusFilterSettings: Set = ["All", "Active", "Inactive"]
var PPRDateFilterSettings: Set<NSDate> = [] // this value needs to be set programmatically by loading up the available PPR Dates --- use PPRDateFilterSettings.insert(dateVariable)
var projectManagerFilterSettings: Set<String> = [] // this value needs to be set programatically by loading up the available PMs
When the program compiles I get one error that shows up in the issue navigator; the compiler error is not shown against any particular line in the code.
When I go to the issue navigator it shows the following error against this class (all other classes compile correctly with no errors):
"Swift Compiler Error Command failed due to signal: Segmentation fault: 11"
I admit to not knowing how to debug this error.
I do know that if I comment out the let locationFilterSettings.. line in the code, the compiler error goes away.
I have just added this code for the variables shown above and do not make any other reference to the filterSettings variable yet. No other changes have been made to the code, which was compiling and running as expected.
If you have any advice on where/how to debug the issue, please let me know; I am not sure what to do next.
I should add that I am running the latest versions of Xcode and OS X.
I have also tried playing with optional declarations, as suggested in one of the answers to Swift compiler segmentation fault when building, but to no avail.
EDIT: Some additional information.
I deleted and re-installed Xcode. The error still occurred.
Having declared the variables within the class I wasn't actually referencing them within any functions, so I tried println-ing the variables at a few spots in the code. The error still occurred.
I moved the declarations from the class level to within one of the functions. The error disappeared.
So the third step above partially solved the issue for me. I wanted the variables to be available throughout the class, so now I may need to pass them around as parameters (which seems to work). However, I still do not understand why the error was occurring and whether it was a syntax thing that I was missing.
OK - I have now been able to compile the code without an error with the properties declared at the top of the class.
The issue was the use of the short-form declaration, which relies on the element type being inferred:
let propertyName: Set = ["item1", "item2"]
when I initialised the property using the following syntax
let propertyName: Set<String> = ["item1", "item2"]
it compiled without an error. The short form declaration worked when the property was declared within a function.
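Applied to the declarations from the question, the fix looks like this (a sketch only; the surrounding class name is hypothetical):

import Cocoa

class FiltersViewController: NSViewController {
    var filterSettings: Dictionary<String, String> =
        ["Location": "All", "Status": "All", "PPRDate": "All", "Project Manager": "All"]

    // Spelling out the element type avoids the type inference that
    // crashed the compiler when these were declared at class scope.
    let locationFilterSettings: Set<String> =
        ["All", "Newcastle", "Sydney", "ACT & Southern NSW", "Western Sydney", "Grafton"]
    let statusFilterSettings: Set<String> = ["All", "Active", "Inactive"]

    var PPRDateFilterSettings: Set<NSDate> = []
    var projectManagerFilterSettings: Set<String> = []
}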

'clusters' in igraph unable to find

g=get.adjacency(erdos.renyi.game(100,0.5))
clusters(g)
Error in (function (classes, fdef, mtable) : unable to find an
inherited method for function ‘clusters’ for signature ‘"dgCMatrix",
"missing"’
clusters(graph.adjacency(g,mode='undirected',weighted=T))
Error in (function (classes, fdef, mtable) : unable to find an
inherited method for function ‘clusters’ for signature ‘"igraph",
"missing"’
For some reason it first considers g a sparse matrix, and then, when I change it to a proper graph object, it just cannot find the function. Until a few days ago it was working perfectly, and then on Friday after lunch it just stopped working and I started getting these error messages. I would be grateful to anyone who has an idea about this issue. It seems the function is not in the package anymore; when I search for it, it just does not appear to be there.
