I currently have the following problem:
I have created a narrowcasting client for use in a shop, which runs on a Raspberry Pi 2 Model B. Specs: quad-core 900MHz ARM Cortex-A7 processor (BCM2836 chipset), 1024MB LPDDR2 RAM, and a 16GB class 10 MicroSD card.
I have installed the latest JDK and extended it with the JavaFX packages (as stated in the first answer here). JavaFX is running now, so that should be fine.
However, when I try to run my application, the animations run very slowly (see this video). I am using the code below for the animations:
public void initImagesTransitions() {
    // slide the image holder in from above the screen
    EventHandler<ActionEvent> fadeIn = event -> {
        TranslateTransition in = new TranslateTransition(Duration.seconds(1), imageHolder);
        in.setFromY(-(SystemUtils.getScreenHeight() + 100));
        in.setToY(-80);
        in.setCycleCount(1);
        in.play();
        imageHolder.setVisible(true);
    };
    // slowly zoom in on the image holder over 5 seconds
    EventHandler<ActionEvent> zoomIn = event -> {
        ScaleTransition st = new ScaleTransition(Duration.millis(5000), imageHolder);
        st.setByX(0.8f);
        st.setByY(0.8f);
        st.setCycleCount(1);
        st.play();
    };
    // slide the image holder out below the screen, then hide it
    EventHandler<ActionEvent> fadeOut = event -> {
        TranslateTransition out = new TranslateTransition(Duration.seconds(1), imageHolder);
        out.setToY(500 + SystemUtils.getScreenHeight());
        out.setCycleCount(1);
        out.play();
        out.setOnFinished(e -> imageHolder.setVisible(false));
    };
    Timeline timeline = new Timeline(
        new KeyFrame(Duration.ZERO, fadeIn),
        new KeyFrame(Duration.seconds(3), zoomIn),
        new KeyFrame(Duration.seconds(19), fadeOut)
    );
    timeline.playFromStart();
}
So nothing special there, in my opinion. The Raspberry Pi is only at about 30% CPU usage, so the slowness is unclear to me as well. The only other thing running in the background is a like-checker which communicates with my server API every minute.
Does anyone have any clue why the animations are slow?
Any help is greatly appreciated!
Just brainstorming here.
Have you tried toying with the -Dprism.order=... settings, like =sw or =j2d?
Or increasing the VRAM on your Pi?
(This also seems relevant: javafx-very-slow-on-raspberry-pi.)
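If editing the launch command on the Pi is awkward, the same switch can be set programmatically. A minimal sketch, assuming a hypothetical application class MyApp; the properties must be set before the JavaFX toolkit initializes, which is why they go first in main(), and this should be equivalent to passing -Dprism.order=sw on the command line:

public class Launcher {
    public static void main(String[] args) {
        // Equivalent to -Dprism.order=sw on the command line.
        // Try "sw" (software pipeline) or "j2d" (Java2D) and compare performance.
        System.setProperty("prism.order", "sw");
        // Makes Prism log which rendering pipeline it actually picked.
        System.setProperty("prism.verbose", "true");
        javafx.application.Application.launch(MyApp.class, args); // MyApp is hypothetical
    }
}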
I'm trying to send a UDP broadcast on the HoloLens, but for some reason the thread won't get executed. Interestingly, when I compile it on my old laptop in VS, it works?! I compared the installed SDKs, the settings, everything, but I cannot figure out why it works when I compile it on my old laptop. It was a mere coincidence that I discovered that in the first place.
This calls the thread:
public void StartThread()
{
    // create thread for reading UDP messages
    readThread = new Thread(new ThreadStart(ReceiveData));
    readThread.IsBackground = true;
    readThread.Start();
}
And the thread looks like this:
private void ReceiveData()
{
    client = new UdpClient(port);
    client.EnableBroadcast = true;
    Debug.Log("Thread Started");
    while (true)
    {
        try
        ....
The debug line doesn't even get executed. It works in Unity but not on the HoloLens, unless I compile it on that old machine of mine.
Any thoughts? I'm at a total loss here.
As derHugo suggested, the thread needed to be started like this:
public void StartThread()
{
    // create thread for reading UDP messages
    readThread = new Thread(ReceiveData);
    readThread.IsBackground = true;
    readThread.Start();
}
That did the trick.
This question is about running a non-blocking, high-performance activity in NativeScript, needed for the simple task of reading and saving raw audio from the microphone by directly accessing the hardware through the native Android API. I believe I have pushed the NativeScript framework to the edge of its capabilities, and I need experts' help.
I'm building a WAV audio recorder in NativeScript for Android. The native implementation is described here (relevant code below).
In short, this can be done by reading the audio stream from an android.media.AudioRecord buffer, and then writing the buffer to a file in a separate thread, as described:
Native Android implementation
startRecording() is triggered by a button press, and starts a new Thread that runs writeAudioDataToFile():
private void startRecording() {
    // ... init Recorder
    recorder.startRecording();
    isRecording = true;
    recordingThread = new Thread(new Runnable() {
        @Override
        public void run() {
            writeAudioDataToFile();
        }
    }, "AudioRecorder Thread");
    recordingThread.start();
}
Recording is stopped by setting isRecording to false (stopRecording() is triggered by a button press):
private void stopRecording() {
    isRecording = false;
    recorder.stop();
    recorder.release();
    recordingThread = null;
}
Reading and saving the buffer stops once isRecording is set to false:
private void writeAudioDataToFile() {
    // ... init file and buffer
    ByteArrayOutputStream recData = new ByteArrayOutputStream();
    DataOutputStream dos = new DataOutputStream(recData);
    while (isRecording) {
        // write exactly as many samples as were actually read from the recorder
        int bufferReadResult = recorder.read(buffer, 0, bufferSize);
        for (int i = 0; i < bufferReadResult; i++) {
            dos.writeShort(buffer[i]);
        }
    }
}
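As an aside (this part is not in the referenced implementation): the loop above produces raw PCM, so a standard 44-byte WAV header still has to be prepended to make the file playable, and note that DataOutputStream.writeShort writes big-endian values while WAV expects little-endian samples. A minimal sketch of building such a header:

// Sketch: a standard 44-byte PCM WAV header. Assumes 16-bit samples;
// all multi-byte fields in a WAV file are little-endian.
private static byte[] wavHeader(int dataLen, int sampleRate, int channels) {
    int byteRate = sampleRate * channels * 2; // bytes per second at 16 bits/sample
    java.nio.ByteBuffer b = java.nio.ByteBuffer.allocate(44)
            .order(java.nio.ByteOrder.LITTLE_ENDIAN);
    b.put("RIFF".getBytes()).putInt(36 + dataLen).put("WAVE".getBytes());
    b.put("fmt ".getBytes()).putInt(16);      // PCM format chunk is 16 bytes
    b.putShort((short) 1);                    // audio format 1 = uncompressed PCM
    b.putShort((short) channels).putInt(sampleRate).putInt(byteRate);
    b.putShort((short) (channels * 2));       // block align
    b.putShort((short) 16);                   // bits per sample
    b.put("data".getBytes()).putInt(dataLen); // followed by the raw samples
    return b.array();
}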
My NativeScript (TypeScript) implementation:
I wrote NativeScript TypeScript code that does the same as the native Android code above. Problem #1 I faced was that I can't use while(isRecording), because the JavaScript thread would be busy running inside the while loop and would never be able to catch the button click that runs stopRecording().
I tried to solve problem #1 by using setInterval for asynchronous execution, like this:
startRecording() is triggered by a button press, and sets a time interval of 10ms that executes writeAudioDataToFile():
startRecording() {
    this.audioRecord.startRecording();
    this.audioBufferSavingTimer = setInterval(() => this.writeAudioDataToFile(), 10);
}
writeAudioDataToFile() callbacks are queued up every 10ms:
writeAudioDataToFile() {
    let bufferReadResult = this.audioRecord.read(
        this.buffer,
        0,
        this.minBufferSize / 4
    );
    // dos is a DataOutputStream initialized elsewhere
    for (let i = 0; i < bufferReadResult; i++) {
        this.dos.writeShort(this.buffer[i]);
    }
}
Recording is stopped by clearing the time interval (stopRecording() is triggered by a button press):
stopRecording() {
    clearInterval(this.audioBufferSavingTimer);
    this.audioRecord.stop();
    this.audioRecord.release();
}
Problem #2: While this works well, in many cases it makes the UI freeze for 1-10 seconds (for example after clicking a button to stop recording).
I tried changing the time interval that executes writeAudioDataToFile() from 10ms down to 0ms and up to 1000ms (while using a very big buffer), but then the UI freezes lasted longer and I experienced loss in the saved data (buffered data that was not saved to the file).
I tried to offload this operation to a separate thread by using a NativeScript worker thread as described here, where startRecording() and stopRecording() are called by messages sent to the thread like this:
global.onmessage = function(msg) {
    if (msg.data === 'startRecording') {
        startRecording();
    } else if (msg.data === 'stopRecording') {
        stopRecording();
    }
}
This solved the UI problem, but created problem #3: the recorder was not stopped on time (i.e. recording stops 10 to 50 seconds after the 'stopRecording' msg.data is received by the worker thread). I tried using different time intervals in the setInterval inside the worker thread (0ms to 1000ms), but that didn't solve the problem and even made stopRecording() execute with greater delays.
Does anyone have an idea of how to perform such a non-blocking, high-performance recording activity in NativeScript/JavaScript?
Is there a better approach to solving problem #1 (JavaScript asynchronous execution) that I described above?
Thanks
I would keep the complete Java implementation in actual Java. You can do this by creating a Java file in your plugin folder:
platforms/android/java, so maybe something like:
platforms/android/java/org/nativescript/AudioRecord.java
In there you can do everything threaded, so you won't be troubled by the UI being blocked. You can call the Java methods directly from NativeScript for starting and stopping the recording. When you build your project, the Java file will automatically be compiled and included.
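For illustration, here is a minimal sketch of what such a class could look like (the class name WavRecorder and the 16-bit mono settings are my own placeholders, and the WAV header writing is omitted, as in the question):

package org.nativescript;

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import java.io.DataOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class WavRecorder {
    private static final int SAMPLE_RATE = 44100;
    private final int bufferSize = AudioRecord.getMinBufferSize(
            SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);

    private AudioRecord recorder;
    private Thread recordingThread;
    private volatile boolean isRecording;

    public void startRecording(final String filePath) {
        recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
        recorder.startRecording();
        isRecording = true;
        recordingThread = new Thread(new Runnable() {
            @Override
            public void run() {
                writeAudioDataToFile(filePath);
            }
        }, "AudioRecorder Thread");
        recordingThread.start();
    }

    public void stopRecording() {
        isRecording = false; // the write loop below exits on its own
        // joining keeps stop deterministic; release only after the thread is done
        try {
            recordingThread.join();
        } catch (InterruptedException ignored) { }
        recorder.stop();
        recorder.release();
        recorder = null;
        recordingThread = null;
    }

    private void writeAudioDataToFile(String filePath) {
        short[] buffer = new short[bufferSize / 2];
        try (DataOutputStream dos = new DataOutputStream(new FileOutputStream(filePath))) {
            while (isRecording) {
                // blocking read is fine here: we are on a dedicated Java thread
                int read = recorder.read(buffer, 0, buffer.length);
                for (int i = 0; i < read; i++) {
                    dos.writeShort(buffer[i]);
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

From the NativeScript side you would then call something like new org.nativescript.WavRecorder().startRecording(path) directly; the blocking read loop never touches the JavaScript thread, and stopping is deterministic because stopRecording() joins the recording thread.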
You can generate typings from your Java class by grabbing classes.jar from the generated .aar file of your plugin ({plugin_name}.aar) and generating type declarations for it: https://docs.nativescript.org/core-concepts/android-runtime/metadata/generating-typescript-declarations
This way you have all the method/class/type information available in your editor.
I'm trying to make a Proof of Concept (C#) for basic graphical maps, as an alternative to Google Maps for Android and iOS devices, because Google started charging fees for their APIs (from my understanding, only affecting web right now).
It doesn't need to be particularly advanced, simply a GUI that shows a base map where you can draw:
Markers
Lines
Polygons
The only requirement I have is that it should be open source, or at as low a cost as possible.
What I've done so far is to use data from http://openstreetmap.org and set up a tile-server (https://switch2osm.org/serving-tiles/) on a separate Linux machine.
Furthermore, it was fairly quick to create a simple web app with OpenLayers.js and Leaflet.js connected to the custom tile-server, with the requirements met.
What I need to do now is find a free or cheap mobile SDK for Xamarin for Android and iOS. I managed to render a map from my own tile-server and add markers by referencing the .dlls from this zip from 2014 (only tested for Android): https://github.com/OsmSharp/ui/releases/tag/v4.2.0.723
using OsmSharp.Android.UI;
using OsmSharp.Android.UI.Data.SQLite;
using OsmSharp.Math.Geo;
using OsmSharp.UI.Map;
using OsmSharp.UI.Map.Layers;

[Activity(Label = "@string/app_name", MainLauncher = true)]
public class MainActivity : AppCompatActivity
{
    private MapView _mapView { get; set; }
    private Layer _mapLayer { get; set; }

    protected override void OnCreate(Bundle savedInstanceState)
    {
        base.OnCreate(savedInstanceState);
        try
        {
            Native.Initialize();
            _mapView = new MapView(this, new MapViewSurface(this))
            {
                MapTilt = 0,
                MapCenter = new GeoCoordinate(lat, lon), // "long" is a C# keyword, so e.g. "lon"
                MapZoom = 16,
                Map = new Map()
            };
            // create a marker under Resources/drawable/pin.png
            using (var bitmap = BitmapFactory.DecodeResource(Resources, Resource.Drawable.pin))
            {
                var marker = new MapMarker(this, new GeoCoordinate(lat, lon), MapMarkerAlignmentType.CenterBottom, bitmap);
                _mapView.AddMarker(marker);
            }
            _mapLayer = _mapView.Map.AddLayerTile("http://*.*.*.*/{0}/{1}/{2}.png");
            SetContentView(_mapView);
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex);
        }
    }
}
However, these .dlls seemed to lack support for Lines and Polygons. I have tried to get something similar working with OsmSharp's latest NuGet package (2018-06-04), but my novice Xamarin experience only gets me so far.
Does anyone have any tips on how to use my own tile-server and render native maps on Android and iOS devices?
PS. It doesn't strictly need to be OpenStreetMap with OsmSharp connected to a custom tile-server; that's just something I'm leaning towards right now. Again, the requirements are open source or low cost, with the flexibility to add Markers, Lines and Polygons.
Every time MediaPlayer.Play() is executed from the UI thread, the UI freezes for a significant amount of time. I don't think you can do anything about the time it takes to start playing the SongCollection, but at least the UI should stay responsive.
Running MediaPlayer.Play() from another thread obviously doesn't work.
The MediaPlayer is a component from the XNA namespace. If you are using this feature in a game, you are most certainly running a game loop, which prevents this freeze from happening: GameLoop
If you use this component in an app, you can simulate this behavior yourself:
public MainPage()
{
    InitializeComponent();

    // Timer to simulate the XNA Game Studio game loop (Microphone is from XNA Game Studio)
    DispatcherTimer dt = new DispatcherTimer();
    dt.Interval = TimeSpan.FromMilliseconds(33);
    dt.Tick += delegate { try { FrameworkDispatcher.Update(); } catch { } };
    dt.Start();
}
(See the complete sample on how to run a microphone outside a game loop: msdn.)
I'm trying to create a 3D renderer for stereo vision with quad buffering in Processing/Java. The hardware I'm using is ready for this, so that's not the problem.
I had a stereo.jar library for JOGL 1.0 working in Processing 1.5, but now I have to use Processing 2.0 and JOGL 2.0, so I have to adapt the library.
Some things have changed in the source code of JOGL and Processing, and I'm having a hard time figuring out how to tell Processing I want to use quad buffering.
Here's the previous code:
public class Theatre extends PGraphicsOpenGL {
    protected void allocate()
    {
        if (context == null)
        {
            // If OpenGL 2X or 4X smoothing is enabled, setup caps object for them
            GLCapabilities capabilities = new GLCapabilities();

            // Starting in release 0158, OpenGL smoothing is always enabled
            if (!hints[DISABLE_OPENGL_2X_SMOOTH])
            {
                capabilities.setSampleBuffers(true);
                capabilities.setNumSamples(2);
            }
            else if (hints[ENABLE_OPENGL_4X_SMOOTH])
            {
                capabilities.setSampleBuffers(true);
                capabilities.setNumSamples(4);
            }
            capabilities.setStereo(true);

            // get a rendering surface and a context for this canvas
            GLDrawableFactory factory = GLDrawableFactory.getFactory();
            drawable = factory.getGLDrawable(parent, capabilities, null);
            context = drawable.createContext(null);

            // need to get proper opengl context since will be needed below
            gl = context.getGL();

            // Flag defaults to be reset on the next trip into beginDraw().
            settingsInited = false;
        }
        else
        {
            // The following three lines are a fix for Bug #1176
            // http://dev.processing.org/bugs/show_bug.cgi?id=1176
            context.destroy();
            context = drawable.createContext(null);
            gl = context.getGL();
            reapplySettings();
        }
    }
}
This was the renderer of the old library. In order to use it, I needed to do size(100, 100, "stereo.Theatre").
Now I'm trying to do the stereo directly in my Processing sketch. Here's what I'm trying:
PGraphicsOpenGL pg = ((PGraphicsOpenGL)g);
pgl = pg.beginPGL();
gl = pgl.gl;
glu = pg.pgl.glu;
gl2 = pgl.gl.getGL2();
GLProfile profile = GLProfile.get(GLProfile.GL2);
GLCapabilities capabilities = new GLCapabilities(profile);
capabilities.setSampleBuffers(true);
capabilities.setNumSamples(4);
capabilities.setStereo(true);
GLDrawableFactory factory = GLDrawableFactory.getFactory(profile);
If I go on, I should do something like this:
drawable = factory.getGLDrawable(parent, capabilities, null);
but drawable isn't a field anymore and I can't find a way to do it.
How do I set quad buffering?
If I try this:
gl2.glDrawBuffer(GL.GL_BACK_RIGHT);
it obviously doesn't work :/
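For what it's worth, creating a quad-buffered surface standalone in JOGL 2 seems doable, roughly as sketched below (package names depend on the JOGL 2 build; older ones use javax.media.opengl, newer ones com.jogamp.opengl). What I can't figure out is how to make Processing render into such a surface:

import javax.media.opengl.*;          // com.jogamp.opengl.* in newer JOGL 2 builds
import javax.media.opengl.awt.GLCanvas;

// Sketch: a standalone stereo (quad-buffered) canvas, outside of Processing.
GLProfile profile = GLProfile.get(GLProfile.GL2);
GLCapabilities capabilities = new GLCapabilities(profile);
capabilities.setStereo(true);

GLCanvas canvas = new GLCanvas(capabilities);
canvas.addGLEventListener(new GLEventListener() {
    public void init(GLAutoDrawable d) { }
    public void dispose(GLAutoDrawable d) { }
    public void reshape(GLAutoDrawable d, int x, int y, int w, int h) { }
    public void display(GLAutoDrawable d) {
        GL2 gl2 = d.getGL().getGL2();
        gl2.glDrawBuffer(GL2.GL_BACK_LEFT);   // render the left-eye view
        // ... draw left ...
        gl2.glDrawBuffer(GL2.GL_BACK_RIGHT);  // render the right-eye view
        // ... draw right ...
    }
});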
Thanks.