How to add an mp3 to a ScalaFX GUI scene? - scalafx

I am trying to add an mp3 to my Scala GUI using ScalaFX, but I am having trouble figuring out how to add it to the scene.
This is what I have, but it doesn't work...
val gameStage = new PrimaryStage {
  title = "Game Graphics"
  scene = new Scene(windowWidth, windowHeight) {
    var audio = new Media(url)
    var mediaPlayer = new MediaPlayer(audio)
    mediaPlayer.volume = 100
    mediaPlayer.play()
  }
}

It appears that one problem is that you have not used a MediaView instance to add the MediaPlayer to the scene. Also, it's probably better if you do not start to play the media until the scene has been displayed.
I think you need something like this (as a complete app):
import scalafx.application.JFXApp
import scalafx.application.JFXApp.PrimaryStage
import scalafx.scene.{Group, Scene}
import scalafx.scene.media.{Media, MediaPlayer, MediaView}

object GameGraphics extends JFXApp {

  // Required info. Populate as necessary...
  val url = ???
  val windowWidth = ???
  val windowHeight = ???

  // Initialize the media and media player elements.
  val audio = new Media(url)
  val mediaPlayer = new MediaPlayer(audio)
  mediaPlayer.volume = 100

  // The primary stage is best defined as the stage member of the application.
  stage = new PrimaryStage {
    title = "Game Graphics"
    width = windowWidth
    height = windowHeight

    scene = new Scene {

      // Create a MediaView instance of the media player, and add it to the scene. (It needs
      // to be the child of a Group, or the child of a subclass of Group.)
      val mediaView = new MediaView(mediaPlayer)
      root = new Group(mediaView)
    }

    // Now play the media.
    mediaPlayer.play()
  }
}
Also, you should prefer val to var, particularly if there is no need to modify the associated variables after they have been defined.
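For example (hypothetical names, purely to illustrate the distinction):
val windowTitle = "Game Graphics" // a val is bound once and cannot be reassigned
var score = 0                     // a var may be reassigned later, e.g. score += 10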
BTW, it's not possible to test your code, so please consider posting a minimal, complete and verifiable example next time.

Related

Updating Texture2D frequently causes process to crash (UpdateSubresource)

I am using SharpDX to render the browser (Chromium) output buffer in a DirectX process.
The process is relatively simple: I intercept the CEF buffer (by overriding the OnPaint method) and write it to a Texture2D.
The code is relatively simple:
Texture creation:
public void BuildTextureWrap() {
    var oldTexture = texture;

    texture = new D3D11.Texture2D(DxHandler.Device, new D3D11.Texture2DDescription() {
        Width = overlay.Size.Width,
        Height = overlay.Size.Height,
        MipLevels = 1,
        ArraySize = 1,
        Format = DXGI.Format.B8G8R8A8_UNorm,
        SampleDescription = new DXGI.SampleDescription(1, 0),
        Usage = D3D11.ResourceUsage.Default,
        BindFlags = D3D11.BindFlags.ShaderResource,
        CpuAccessFlags = D3D11.CpuAccessFlags.None,
        OptionFlags = D3D11.ResourceOptionFlags.None,
    });

    var view = new D3D11.ShaderResourceView(
        DxHandler.Device,
        texture,
        new D3D11.ShaderResourceViewDescription {
            Format = texture.Description.Format,
            Dimension = D3D.ShaderResourceViewDimension.Texture2D,
            Texture2D = { MipLevels = texture.Description.MipLevels },
        }
    );

    textureWrap = new D3DTextureWrap(view, texture.Description.Width, texture.Description.Height);

    if (oldTexture != null) {
        obsoleteTextures.Add(oldTexture);
    }
}
That piece of code is executed at startup and whenever a resize happens.
Now, when CEF's OnDraw fires, I basically copy its buffer to the texture:
var destinationRegion = new D3D11.ResourceRegion {
    Top = Math.Min(r.dirtyRect.y, texDesc.Height),
    Bottom = Math.Min(r.dirtyRect.y + r.dirtyRect.height, texDesc.Height),
    Left = Math.Min(r.dirtyRect.x, texDesc.Width),
    Right = Math.Min(r.dirtyRect.x + r.dirtyRect.width, texDesc.Width),
    Front = 0,
    Back = 1,
};

// Draw to the target
var context = targetTexture.Device.ImmediateContext;
context.UpdateSubresource(targetTexture, 0, destinationRegion, sourceRegionPtr, rowPitch, depthPitch);
There is some more code, but this is basically the only relevant piece. The whole thing works until OnDraw starts happening frequently.
Apparently, if I force CEF to paint frequently, the whole host process dies.
The crash happens at UpdateSubresource.
So my question is: is there another, safer way to update a texture frequently?
The solution to this problem was relatively simple, yet not so obvious at first.
I simply moved the code responsible for updating the texture into the render loop and kept the internal buffer pointer cached.

ScalaFX how to close a secondary stage

How do I close a secondary stage that is used as an auxiliary window? I have a primaryStage used as the UI platform, and occasionally I need to open a secondary window, which is pretty straightforward, but how to close it from a method is not clear.
Here's how the secondary stage is created:
val ivbox = new VBox(children = new Label("Create New Album"))
val stackpane = new StackPane()
stackpane.children = Seq(ivbox)

val secondstage = new Stage() {
  title = "second stage"
  scene = new Scene(stackpane, 450, 150) {
    stylesheets += getClass.getResource("uistyle.css").toExternalForm
  }
  x = myproto.stage.x.value + 200 // position in relation to primaryStage / scene
  y = myproto.stage.y.value + 100
}
In JavaFX I found this snippet:
private void closeButtonAction() {
    Stage stage = (Stage) closeButton.getScene().getWindow();
    stage.close();
}
It's not clear how to do the equivalent in ScalaFX.
There is no difference in ScalaFX. You close a stage using close(). The code is exactly the same in ScalaFX:
stage.close()
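For example, a close button on the secondary stage could be wired up like this (a minimal sketch; the Button and its handler are assumptions, and secondstage refers to the value defined in the question):
import scalafx.Includes._
import scalafx.event.ActionEvent
import scalafx.scene.control.Button

// Hypothetical close button: clicking it closes the secondary stage.
val closeButton = new Button("Close") {
  onAction = (_: ActionEvent) => secondstage.close()
}
Adding closeButton to the secondary stage's scene graph (for example, to stackpane's children) gives the window its own close control.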

AVCapturePhotoSettings not accepting NSDictionary element

I'm not sure what I am doing wrong. I want to create a simple custom camera: I create the AVCapturePhotoOutput, attach it to the AVCaptureSession, then create AVCapturePhotoSettings with the minimum settings needed to make taking a picture work; see the code below.
I get an exception saying kCVPixelBufferPixelFormatTypeKey is not defined, but it is indeed in the NSDictionary I am passing.
I need some light here, thanks.
public void TakePicture()
{
    var output = new AVCapturePhotoOutput();
    _captureSession.AddOutput(output);

    var settings = AVCapturePhotoSettings.Create();
    var previewPixelType = settings.AvailablePreviewPhotoPixelFormatTypes.First();

    var keys = new[]
    {
        new NSString("kCVPixelBufferPixelFormatTypeKey"),
        new NSString("kCVPixelBufferWidthKey"),
        new NSString("kCVPixelBufferHeightKey"),
    };

    var objects = new NSObject[]
    {
        // don't have to be strings... can be any NSObject.
        previewPixelType,
        new NSString("160"),
        new NSString("160")
    };

    var dicionary = new NSDictionary<NSString, NSObject>(keys, objects);
    settings.PreviewPhotoFormat = dicionary;

    output.CapturePhoto(settings, this);
}
It is because kCVPixelBufferPixelFormatTypeKey is not available under that name in Xamarin.
You should use CVPixelBuffer.PixelFormatTypeKey here; it will be converted to kCVPixelBufferPixelFormatTypeKey automatically when compiling.
The same goes for kCVPixelBufferWidthKey and kCVPixelBufferHeightKey: the APIs are CVPixelBuffer.WidthKey and CVPixelBuffer.HeightKey in Xamarin.iOS.

Haxe Type Not Found

I'm trying to run the most basic Haxe program but keep getting errors.
The Main.hx file looks like this:
package;

import flash.display.Sprite;
import flash.display.StageAlign;
import flash.display.StageScaleMode;
import flash.events.Event;
import flash.Lib;
import flixel.FlxGame;
import flixel.FlxState;

class Main extends Sprite {
    var gameWidth:Int = 640; // Width of the game in pixels (might be less / more in actual pixels depending on your zoom).
    var gameHeight:Int = 480; // Height of the game in pixels (might be less / more in actual pixels depending on your zoom).
    var initialState:Class<FlxState> = MenuState; // The FlxState the game starts with.
    var zoom:Float = -1; // If -1, zoom is automatically calculated to fit the window dimensions.
    var framerate:Int = 60; // How many frames per second the game should run at.
    var skipSplash:Bool = false; // Whether to skip the flixel splash screen that appears in release mode.
    var startFullscreen:Bool = false; // Whether to start the game in fullscreen on desktop targets

    // You can pretty much ignore everything from here on - your code should go in your states.
    public static function main():Void
    {
        Lib.current.addChild(new Main());
    }

    public function new()
    {
        super();

        if (stage != null)
        {
            init();
        }
        else
        {
            addEventListener(Event.ADDED_TO_STAGE, init);
        }
    }

    private function init(?E:Event):Void
    {
        if (hasEventListener(Event.ADDED_TO_STAGE))
        {
            removeEventListener(Event.ADDED_TO_STAGE, init);
        }

        setupGame();
    }

    private function setupGame():Void
    {
        var stageWidth:Int = Lib.current.stage.stageWidth;
        var stageHeight:Int = Lib.current.stage.stageHeight;

        if (zoom == -1)
        {
            var ratioX:Float = stageWidth / gameWidth;
            var ratioY:Float = stageHeight / gameHeight;
            zoom = Math.min(ratioX, ratioY);
            gameWidth = Math.ceil(stageWidth / zoom);
            gameHeight = Math.ceil(stageHeight / zoom);
        }

        addChild(new FlxGame(gameWidth, gameHeight, initialState, zoom, framerate, framerate, skipSplash, startFullscreen));
    }
}
Just the generic template file. When I run it in Terminal (running Mac OS X El Capitan), I get this error:
Main.hx:8: characters 7-21 : Type not found : flixel.FlxGame
I haven't had problems with the installation or anything, but I am new to Haxe, so I don't know where to start. Any ideas?
Thanks :)
Did you add the library when you tried to run your game?
You can do that on the command line with haxe -lib flixel -main Main ....
Or by writing an .hxml file containing all your CLI arguments:
-lib flixel
-main Main
Update after @Gama11's comment:
HaxeFlixel uses the OpenFL format for the compilation information (see http://www.openfl.org/documentation/projects/project-files/xml-format/).
So you should include the flixel library using <haxelib name="flixel" /> in your Project.xml file.

Difficulty attaching an event to a programmatically created MovieClip

This has been really bugging me and I can't seem to ask the right question while searching for a solution! The problem is simple enough: why can I not attach an event to a programmatically created MovieClip in ActionScript 3? This has got to be something basic I am missing, right?
package {
    //imports
    import flash.display.*;
    import flash.events.*;

    //document class
    public class Document extends MovieClip {

        //variables
        public var utilities:Object;
        public var test:MovieClip;

        //constructor
        public function Document()
        {
            //instantiate the utilities object - this will contain some
            //simple functions for setting dimensions and background color of
            //a display object.
            this.utilities = new Object();

            //utility function: set dimensions.
            this.utilities.setDimensions = function(what:*, width:uint, height:uint)
            {
                what.graphics.drawRect(0, 0, width, height);
            }

            //utility function: set opaque background color.
            this.utilities.setBgColor = function(what:DisplayObject, color:uint)
            {
                what.cacheAsBitmap = true;
                what.opaqueBackground = color;
            }

            //create/add a test movie clip.
            test = new MovieClip();

            //set dimensions and background color of the test movie clip.
            this.utilities.setDimensions(test, 100, 100);
            this.utilities.setBgColor(test, 0xff0000);

            //add test movie clip to document class.
            this.addChild(test);

            //add a click event to the test movie clip.
            test.addEventListener(MouseEvent.CLICK, onClick);
        }

        //click event handler
        private function onClick(evt:Event):void
        {
            trace('click');
        }
    }
}
//update based on Casey Yee's answer
//the set dimensions function needs to contain a fill to be clickable;
//a fill is added with alpha set to zero to maintain transparency.

//utility function: set dimensions
this.utilities.setDimensions = function(what:*, width:uint, height:uint)
{
    what.graphics.beginFill(0x000000, 0);
    what.graphics.drawRect(0, 0, width, height);
    what.graphics.endFill();
}
test is still an empty movie clip; try drawing something into it:
test = new MovieClip();
test.graphics.beginFill(0xFF0000);
test.graphics.drawRect(0,0,100,100);
test.graphics.endFill();
