I am trying to map amplitude to a synth through a control bus, in order to smooth out the sine wave's amplitude changes (i.e. remove high frequencies) based on semi-random inputs from an outside source, but when I run this code there is no sound.
s.boot;
(
SynthDef(\cello, { |rate = 440, amp = 0.2|
    var signal;
    signal = SinOsc.ar(rate, 0, amp);
    Out.ar([0, 1], signal);
}).add;
)

h = Synth(\cello, [\rate, 440, \amp, 0]);
c = Bus.control(s, 2);
c.scope;

(
Task({
    var counter, pyAmp, newAmp, oldAmp = 0;
    counter = 0;
    { counter < 1 }.while({
        pyAmp = 3.0.rand;
        pyAmp.postln;
        d = { XLine.ar(oldAmp, pyAmp, 0.1) }.play(outbus: c);
        "and".postln;
        oldAmp.postln;
        oldAmp = pyAmp;
        h.map(\amp, d);
        0.1.wait;
    });
}).play;
)
You have at least a couple of problems.
Your first XLine synth tries to do an XLine starting from 0. Absolute zero is a problem in exponential-land: an exponential curve can never reach it, so the whole thing is impossible. Start from a tiny but positive value instead.
You are creating little XLine synths to set the amp, but you're never releasing them, so lots of them build up on the server. Who knows what value that amp adds up to in the end? You should use a doneAction (doneAction: 2) so the synths free themselves once their line completes.
Thirdly (but not harmful): there's no point running those XLine synths at audio rate. You may as well use XLine.kr, not XLine.ar, since they only drive a control bus.
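Putting those fixes together, the line inside the Task might become something like this (a minimal sketch, untested; the 0.001 floor is an arbitrary choice, and note that h.map wants the control bus c rather than the synth d, which looks like a further bug in the original):
d = { XLine.kr(oldAmp.max(0.001), pyAmp.max(0.001), 0.1, doneAction: 2) }.play(outbus: c);
h.map(\amp, c); // \amp now follows the control bus that the XLine writes to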
I have perhaps two thousand SVG files, obtained as a result of scanning a number of brochures to greyscale JPG, then batch-binarising them and saving to monochrome TIFF using ScanTailor, then batch-vectorising the TIFFs (using the command-line utility ImageMagick and a quick FOR loop in Fish), and finally resizing/editing them by hand in Inkscape. You can see my workflow heavily favours command-line UNIX utilities; I'll use a graphical tool if necessary, but not for repetitive tasks. If it makes any difference at all, my preferred UNIX distribution is MacOS.
What is left, at the end of this process, is a ~750KB file containing essentially a complete mathematical description of the page. Every printed letter or stroke of the pen has its own path, and (because I didn't use any sort of despeckling algorithm) so does every meaningless artifact (although I made sure to clean up the edges of every page, for workflow reasons).
Most of the scans, though, were imperfect (300 ppi when 600 or 900 ppi would have been better); the binarisation algorithm (the Otsu method) wasn't perfect either, etc. All of the imperfections added up, so in most cases the paths are rather noisy. The path representing the printed capital letter H, for example, needs only eight nodes (corners), or sixteen if it has rounded terminals (ends). I'm willing to accept more than that, because after all the system isn't perfect, but when I see thirty nodes on the H, and it has scalloped sides under magnification, my eyes start to bleed.
I know this isn't anything to worry about when the pages (rendered as PNG) reach the print shop, because the Mark 1 Eyeball will smooth everything out, but I'm too much of a perfectionist to leave it like that.
To solve the problem, I tried selecting all paths in Inkscape with Cmd/A, then using the "simplify path" command by typing Cmd/L. What I expected was that Inkscape would smooth all the paths individually; what resulted was Inkscape smoothing everything collectively into one blurry mess.
I get the result I want if I select path number one and type Cmd/L, then path number two and again Cmd/L, but a representative page has over FOUR HUNDRED paths and this kind of workflow is essentially impracticable.
I know Inkscape has a (very badly documented) command-line mode, and there might perhaps be a script available to do what needs doing, but if it exists somewhere I can't find it. An ideal solution would be to do what I described above, but programmatically (shell script?), then a FOR loop to do it on every file in the directory.
The basic algorithm for path simplification is not that complicated, as long as you do not need to handle curves. So one avenue could be to write a script yourself. The following is an excerpt from a node.js script I have used to simplify polygons/polylines in maps. For geojson files in the size range of up to several MB it would run in 0.1-0.2 seconds on my (old) computer.
The idea (the Ramer-Douglas-Peucker algorithm) is to take the first and last points of a polyline and to measure how far each intermediate point lies from the straight line connecting the two. If even the farthest one deviates by less than a threshold (the smallest deviation from a straight line you will be able to spot), all the intermediate points can be safely removed. If not, that farthest point is preserved, and the same test is applied recursively to the two halves on either side of it.
const sqEpsilon = ... // square (!) of the minimum distance to preserve
// takes a list of points in the form [[x,y],...]
function simplifyDP(points) {
  const len = points.length;
  const markers = new Uint8Array(len);
  markers[0] = markers[len - 1] = 1;
  simplifyDPStep(points, markers, 0, len - 1);
  // keep only the points whose markers were set
  return markers.reduce((pts, m, i) => {
    if (m) pts.push(points[i]);
    return pts;
  }, []);
}

function simplifyDPStep(points, markers, first, last) {
  let maxSqDist = 0, idx;
  for (let i = first + 1; i <= last - 1; i++) {
    const sqDist = sqDistance(points[i], points[first], points[last]);
    if (sqDist > maxSqDist) {
      idx = i;
      maxSqDist = sqDist;
    }
  }
  if (maxSqDist > sqEpsilon) {
    markers[idx] = 1;
    simplifyDPStep(points, markers, first, idx);
    simplifyDPStep(points, markers, idx, last);
  }
}

// squared distance from point p to the segment p1-p2
function sqDistance(p, p1, p2) {
  let x = p1[0],
    y = p1[1];
  const dx = p2[0] - x,
    dy = p2[1] - y;
  const dot = dx * dx + dy * dy;
  if (dot > 0) {
    // project p onto the segment, clamping to the endpoints
    const t = ((p[0] - x) * dx + (p[1] - y) * dy) / dot;
    if (t > 1) {
      x = p2[0];
      y = p2[1];
    } else if (t > 0) {
      x += dx * t;
      y += dy * t;
    }
  }
  const cdx = p[0] - x;
  const cdy = p[1] - y;
  return cdx * cdx + cdy * cdy;
}
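As a quick sanity check, assuming, say, const sqEpsilon = 0.01 at the top (i.e. deviations of less than 0.1 units count as noise), nearly-collinear middle points get dropped:
const line = [[0, 0], [1, 0.02], [2, 0], [2, 1], [2, 2]];
console.log(simplifyDP(line));
// -> [[0, 0], [2, 0], [2, 2]]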
Alright, I think I've mostly figured out how MagicTile works; the source code, at least (not really the math as much yet). It all begins with the build and render calls in MainForm.cs, which generate the tessellation.
First, it "generates" the tessellation. Since MagicTile is a Rubik's-cube-like game, I guess it just statically computes all of the tiles up front. It does this by starting with a central tile and reflecting its polygon (and the polygon's segments and points) using some sort of math which I've read about several times but couldn't explain. Then it appears they allow rotations of the tessellation, where they call code like this in the "renderer":
Polygon p = sticker.Poly.Clone();
p.Transform( m_mouseMotion.Isometry );
Color color = GetStickerColor( sticker );
GLUtils.DrawConcavePolygon( p, color, GrabModelTransform() );
They track the mouse position, like if you are dragging, and somehow that is used to create an "isometry" that transforms the overall tessellation; then the polygon is transformed using that isometry. It appears they only do the central tile and one or two levels after that, but I can't quite tell; I haven't gotten the app to run under a debugger yet (it's also in C#, which is a new language for me, coming from TypeScript). The Transform function digs down like this (here it is in TypeScript, as I've been converting it):
TransformIsometry(isometry: Isometry) {
  for (let s of this.Segments) {
    s.TransformIsometry(isometry)
  }
  this.Center = isometry.Apply(this.Center)
}
That goes into the transform for the segments here:
/// <summary>
/// Apply a transform to us.
/// </summary>
TransformInternal<T extends ITransform>(transform: T) {
  // NOTES:
  // Arcs can go to lines, and lines to arcs.
  // Rotations may reverse arc directions as well.
  // Arc centers can't be transformed directly.

  // NOTE: We must calc this before altering the endpoints.
  // If the midpoint is at infinity, substitute a large-but-finite point
  // derived from whichever endpoint is finite.
  let mid: Vector3D = this.Midpoint
  if (UtilsInfinity.IsInfiniteVector3D(mid)) {
    mid = UtilsInfinity.IsInfiniteVector3D(this.P1)
      ? this.P2.MultiplyWithNumber(UtilsInfinity.FiniteScale)
      : this.P1.MultiplyWithNumber(UtilsInfinity.FiniteScale)
  }

  this.P1 = transform.ApplyVector3D(this.P1)
  this.P2 = transform.ApplyVector3D(this.P2)
  mid = transform.ApplyVector3D(mid)

  // Can we make a circle out of the transformed points?
  let temp: Circle = new Circle()
  if (
    !UtilsInfinity.IsInfiniteVector3D(this.P1) &&
    !UtilsInfinity.IsInfiniteVector3D(this.P2) &&
    !UtilsInfinity.IsInfiniteVector3D(mid) &&
    temp.From3Points(this.P1, mid, this.P2)
  ) {
    this.Type = SegmentType.Arc
    this.Center = temp.Center

    // Work out the orientation of the arc.
    let t1: Vector3D = this.P1.Subtract(this.Center)
    let t2: Vector3D = mid.Subtract(this.Center)
    let t3: Vector3D = this.P2.Subtract(this.Center)
    let a1: number = Euclidean2D.AngleToCounterClock(t2, t1)
    let a2: number = Euclidean2D.AngleToCounterClock(t3, t1)
    this.Clockwise = a2 > a1
  } else {
    // The circle construction fails if the points are collinear
    // (i.e. if the arc has been transformed into a line).
    this.Type = SegmentType.Line

    // XXX - need to do something about this.
    // Turn into 2 segments?
    // if( UtilsInfinity.IsInfiniteVector3D( mid ) )
    // Actually the check should just be whether mid is between p1 and p2.
  }
}
So as far as I can tell, this adjusts the segments based on the mouse position, via the isometry that the mouse-motion code keeps updated.
It appears, though, that they don't have the functionality to "move" the tiling, as if you were walking on it, like in HyperRogue.
So after having studied this code for a few days, I am still not sure how to move or walk along the tiles, moving the outer tiles toward the center, like you're a giant walking on Earth.
First, a small question: can you do this with MagicTile? Can you somehow update the tessellation to move a different tile to the center (and have a function I could plug a tween/animation into so it animates there)? Or do I need to write some custom new code? If so, what do I need to do, roughly speaking? Maybe some pseudocode?
What I imagine is: the user clicks on the outer part of the tessellation; we convert that click data to the tile index in the tessellation, then basically want to do tiling.moveToCenter(tile), but as a frame-by-frame animation, so I'm not quite sure how that would work. And what would that moveToCenter do in terms of the MagicTile rendering/tile-generating code?
As I described in the beginning, it first generates the full tessellation, then only updates 1-3 layers of the tiles for its puzzles. So it's like I need to first shift the frame of reference, then recompute all the potentially visible tiles, somehow without recreating the ones that were already created. I don't quite see how that would work; do you? Once the tiles are recomputed, I just re-render and it should show the updated center. Roughly, I imagine something like the sketch below.
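(Every name in this sketch is invented by me; it's not actual MagicTile API, just the shape of what I have in mind.)
// Hypothetical pseudocode, not real MagicTile code.
function moveToCenter(tile: Tile, frames: number) {
  // An isometry taking the clicked tile's center to the origin:
  const target: Mobius = Mobius.construct()
  target.Isometry(Geometry.Hyperbolic, 0, tile.Center.Negate())
  for (let f = 1; f <= frames; f++) {
    // Interpolate from the identity toward the target isometry (scaling
    // the translation distance by f / frames?), then push every visible
    // tile's polygon through the partial isometry and redraw.
    const partial = interpolateIsometry(target, f / frames)
    renderTiles(partial)
  }
}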
Is it a simple matter of calling some code like this again, for each tile, where the isometry is somehow updated with a border-ish position on the tessellation?
Polygon p = sticker.Poly.Clone();
p.Transform( m_mouseMotion.Isometry );
Or must I do something else? I can't quite see the full picture yet.
Or is that what these three functions are doing? (From my TypeScript port of the C# MagicTile:)
// Move from a point p1 -> p2 along a geodesic.
// Also somewhat from Don.
Geodesic(g: Geometry, p1: Complex, p2: Complex) {
  let t: Mobius = Mobius.construct()
  t.Isometry(g, 0, p1.Negate())
  let p2t: Complex = t.ApplyComplex(p2)

  let m2: Mobius = Mobius.construct()
  let m1: Mobius = Mobius.construct()
  m1.Isometry(g, 0, p1.Negate())
  m2.Isometry(g, 0, p2t)
  let m3: Mobius = m1.Inverse()
  this.Merge(m3.Multiply(m2.Multiply(m1)))
}

Hyperbolic(g: Geometry, fixedPlus: Complex, scale: number) {
  // To the origin.
  let m1: Mobius = Mobius.construct()
  m1.Isometry(g, 0, fixedPlus.Negate())

  // Scale.
  let m2: Mobius = Mobius.construct()
  m2.A = new Complex(scale, 0)
  m2.C = new Complex(0, 0)
  m2.B = new Complex(0, 0)
  m2.D = new Complex(1, 0)

  // Back.
  // Mobius m3 = m1.Inverse(); // Doesn't work well if fixedPlus is on disk boundary.
  let m3: Mobius = Mobius.construct()
  m3.Isometry(g, 0, fixedPlus)

  // Compose them (multiply in reverse order).
  this.Merge(m3.Multiply(m2.Multiply(m1)))
}

// Allow a hyperbolic transformation using an absolute offset.
// offset is specified in the respective geometry.
Hyperbolic2(g: Geometry, fixedPlus: Complex, point: Complex, offset: number) {
  // To the origin.
  let m: Mobius = Mobius.construct()
  m.Isometry(g, 0, fixedPlus.Negate())
  let eRadius: number = m.ApplyComplex(point).Magnitude

  let scale: number = 1
  switch (g) {
    case Geometry.Spherical: {
      let sRadius: number = Spherical2D.e2sNorm(eRadius)
      sRadius = sRadius + offset
      scale = Spherical2D.s2eNorm(sRadius) / eRadius
      break
    }
    case Geometry.Euclidean:
      scale = (eRadius + offset) / eRadius
      break
    case Geometry.Hyperbolic: {
      let hRadius: number = DonHatch.e2hNorm(eRadius)
      hRadius = hRadius + offset
      scale = DonHatch.h2eNorm(hRadius) / eRadius
      break
    }
    default:
      break
  }

  this.Hyperbolic(g, fixedPlus, scale)
}
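For instance (and this is pure guesswork on my part): is recentering just a matter of building a Mobius like this and applying it to every tile's polygon each frame?
// A wild guess, reusing the functions above:
const m: Mobius = Mobius.construct()
// Move the clicked point to the origin along a geodesic:
m.Geodesic(Geometry.Hyperbolic, clickedPoint, new Complex(0, 0))
// ...then wrap m into an Isometry (however that works) and call
// p.Transform(...) on each tile's polygon, feeding it some per-frame
// fraction of the motion to get the animation?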
So I am working with p5.js for class and I am very lost with it, as I don't understand it very well. How do I animate this image to match the sound? I tried frequency analysis, but I don't know how to apply it to the image. I wanted to animate the image as if it were beating like a heart, according to the BPM of the sound I put in the sketch.
Here is the sketch + image + sound:
https://editor.p5js.org/FilipaRita/sketches/cUG6qNhIR
Actually, finding the BPM for an entire piece of music would be a bit complicated (see this sound.stackexchange.com question), but if you just want to detect beats in real time, I think you can probably hack something together that will work. Here is a visualization that I think will help you understand the data returned by fft.analyze():
const avgWindow = 20;
const threshold = 0.4;
let song;
let fft;
let beat;
let lastPeak = 0;

function preload() {
  song = loadSound("https://www.paulwheeler.us/files/metronome.wav");
}

function setup() {
  createCanvas(400, 400);
  fft = new p5.FFT();
  song.loop();
  beat = millis();
}

function draw() {
  // Pulse white on the beat, then fade out with an inverse cube curve
  background(map(1 / pow((millis() - beat) / 1000 + 1, 3), 1, 0, 255, 100));
  drawSpectrumGraph(0, 0, width, height);
}

let i = 0;
// Graphing code adapted from https://jankozeluh.g6.cz/index.html by Jan Koželuh
function drawSpectrumGraph(left, top, w, h) {
  let spectrum = fft.analyze();
  stroke('limegreen');
  fill('darkgreen');
  strokeWeight(1);

  beginShape();
  vertex(left, top + h);
  let peak = 0;
  // compute a running average of values to avoid very
  // localized energy from triggering a beat.
  let runningAvg = 0;
  for (let i = 0; i < spectrum.length; i++) {
    vertex(
      //left + map(i, 0, spectrum.length, 0, w),
      // Distribute the spectrum values on a logarithmic scale
      // We do this because as you go higher in the spectrum
      // the same perceptible difference in tone requires a
      // much larger change in frequency.
      left + map(log(i), 0, log(spectrum.length), 0, w),
      // Spectrum values range from 0 to 255
      top + map(spectrum[i], 0, 255, h, 0)
    );

    runningAvg += spectrum[i] / avgWindow;
    if (i >= avgWindow) {
      // drop the sample that just left the sliding window
      runningAvg -= spectrum[i - avgWindow] / avgWindow;
    }
    if (runningAvg > peak) {
      peak = runningAvg;
    }
  }
  // any time there is a sudden increase in peak energy, call that a beat
  if (peak > lastPeak * (1 + threshold)) {
    // print(`tick ${++i}`);
    beat = millis();
  }
  lastPeak = peak;

  vertex(left + w, top + h);
  endShape(CLOSE);

  // this is the range of frequencies covered by the FFT
  let nyquist = 22050;
  // get the centroid (value in hz)
  let centroid = fft.getCentroid();
  // the mean_freq_index calculation is for the display.
  // centroid frequency / hz per bucket
  let mean_freq_index = centroid / (nyquist / spectrum.length);
  stroke('red');
  // convert index to x value using a logarithmic x axis
  let cx = map(log(mean_freq_index), 0, log(spectrum.length), 0, width);
  line(cx, 0, cx, h);
}
<script src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.3.1/p5.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.3.1/addons/p5.sound.min.js"></script>
Hopefully this code and its comments help you understand the data returned by fft.analyze(), and you can use it as a starting point to achieve the effect you are looking for.
Disclaimer: I have experience with p5.js, but I'm not an audio expert, so there could certainly be better ways to do this. Also, while this approach works for this simple audio file, there's a good chance it would fail horribly for actual music or real-world environments.
If I were you, though, I would cheat and add some metadata that explicitly includes the timestamps of the beats. This becomes a much simpler problem if you shift beat detection to pre-processing (maybe even do it by hand) rather than trying to do it at runtime. The signal processing needed to detect beats in an audio signal is non-trivial.
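For example, a minimal sketch of that approach (the beats array here is assumed to have been produced offline, by hand or by a pre-processing script):
const beats = [0.0, 0.5, 1.0, 1.5]; // beat timestamps in seconds, prepared offline

// Returns a 0..1 pulse value that spikes on each beat and fades out,
// using the same inverse cube curve as the sketch above.
function beatPulse() {
  const t = song.currentTime(); // p5.SoundFile playback position in seconds
  let last = -Infinity;
  for (const b of beats) {
    if (b <= t && b > last) last = b;
  }
  return 1 / pow(t - last + 1, 3);
}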
So I've been stuck for a while because I've been having trouble dynamically changing the shape of the vertices in a plane geometry according to the frequency data of an MP3. I've been having two main problems:
1) The array generated by the MP3 has too many values, and it is impossible to update the vertices that fast. I am getting the frequency data with this code:
var frequencyData = new Uint8Array(analyser.frequencyBinCount);
2) Re-rendering the plane every time frequencyData changes causes extreme performance issues, to the point where it does not render anymore.
I've been using simplex noise to make the vertices morph, and it does work until, obviously, I pass in the frequency data and everything breaks. This is the code I'm trying to use to morph the vertices of the plane according to the music:
function adjustVertices() {
  for (var i = 0; i < 100; i++) {
    for (var j = 0; j < 100; j++) {
      var ex = 0.5;
      pgeom.vertices[i + j * 100].z =
        (noise.simplex2(i / 100, j / 100) +
          noise.simplex2((i + 500) / 50, j / 50) * Math.pow(ex, frequencyData[2]) +
          noise.simplex2((i + 400) / 25, j / 25) * Math.pow(ex, frequencyData[2]) +
          noise.simplex2((i + 600) / 12.5, j / 12.5) * Math.pow(ex, frequencyData[2]) +
          noise.simplex2((i + 800) / 6.25, j / 6.25) * Math.pow(ex, frequencyData[2])) /
        2;
      pgeom.verticesNeedUpdate = true;
      pgeom.computeVertexNormals();
    }
  }
}
This is my plane object:
var pgeom = new THREE.PlaneGeometry(5, 5, 99, 99);
var plane = THREE.SceneUtils.createMultiMaterialObject(pgeom, [
  new THREE.MeshPhongMaterial({
    color: 0x33ff33,
    specular: 0x773300,
    side: THREE.DoubleSide,
    shading: THREE.FlatShading,
    shininess: 3,
  }),
]);
scene.add(plane);
I would be very grateful for any help; I am just doing my best to master three.js :)
I would check whether computeVertexNormals is what is taking the most time in that render loop, and then look into optimizing it, if you still require it.
You can optimize the normal calculation by building the mesh topology once at startup, since it doesn't change at runtime; each recalculation then only updates values, without rebuilding any structure or allocating memory (see the sketch below).
Then reduce the vertex count until things become manageable. :)
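A rough sketch of that idea, assuming the legacy THREE.Geometry API used in the question (a hypothetical helper, not drop-in code):
// Build vertex -> adjacent-face lists once, at startup; the topology never changes.
var adjacentFaces = pgeom.vertices.map(function () { return []; });
pgeom.faces.forEach(function (f) {
  adjacentFaces[f.a].push(f);
  adjacentFaces[f.b].push(f);
  adjacentFaces[f.c].push(f);
});

// Each frame: refresh face normals, then average them per vertex,
// reusing one scratch vector instead of allocating new THREE.Vector3s.
var scratch = new THREE.Vector3();
function recomputeNormalsFast() {
  pgeom.computeFaceNormals();
  pgeom.vertices.forEach(function (v, i) {
    scratch.set(0, 0, 0);
    adjacentFaces[i].forEach(function (f) { scratch.add(f.normal); });
    scratch.normalize();
    adjacentFaces[i].forEach(function (f) {
      var slot = f.a === i ? 0 : f.b === i ? 1 : 2;
      (f.vertexNormals[slot] || (f.vertexNormals[slot] = new THREE.Vector3())).copy(scratch);
    });
  });
  pgeom.normalsNeedUpdate = true;
}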
The first answer is correct. Most likely computing vertex normals is causing the hit, and it's most likely happening because the Geometry class you seem to be using creates a lot of new THREE.Vector3 instances. If you profile this, I imagine you'd see a lot of GC activity and not so much actual computation time.
One more thing to consider, since you only map one variable: move this computation into the shader. You could write your values to a texture and only update that. You would not have to refresh the vertex and normal buffers, which are much larger than the texture you'd need to store just the input variable, and the computation would run in parallel on the GPU.
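As a rough sketch of that approach (the displacement function here is a placeholder; the real point is that the per-frame upload is one tiny texture instead of whole vertex and normal buffers):
// Drive vertex displacement from a 1x1 data texture holding the audio level.
var audioData = new Uint8Array([0, 0, 0, 255]);
var audioTex = new THREE.DataTexture(audioData, 1, 1, THREE.RGBAFormat);
var material = new THREE.ShaderMaterial({
  uniforms: { audioTex: { value: audioTex } },
  vertexShader: [
    'uniform sampler2D audioTex;',
    'void main() {',
    '  float level = texture2D(audioTex, vec2(0.5)).r; // 0..1 audio level',
    '  vec3 p = position;',
    '  p.z += level * sin(p.x * 8.0) * cos(p.y * 8.0); // placeholder displacement',
    '  gl_Position = projectionMatrix * modelViewMatrix * vec4(p, 1.0);',
    '}',
  ].join('\n'),
  fragmentShader: 'void main() { gl_FragColor = vec4(0.2, 0.8, 0.2, 1.0); }',
});

// Per frame: update one byte and flag the texture, nothing else.
// audioData[0] = frequencyData[2]; audioTex.needsUpdate = true;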
I need to control the sound playback speed, so I extract the sample data from the sound file. But how can I control the volume then? SoundTransform.volume has no effect.
private function onSampleData(event:SampleDataEvent):void
{
    var l:Number;
    var r:Number;
    var outputLength:int = 0;
    while (outputLength < 2048)
    {
        // 4 bytes per float and two channels, so the actual position in
        // the ByteArray is a factor of 8 bigger than the phase
        _loadedMP3Samples.position = int(_phase) * 8;
        l = _loadedMP3Samples.readFloat(); // read out the left channel at this position
        r = _loadedMP3Samples.readFloat(); // read out the right channel at this position
        event.data.writeFloat(l); // write the samples to our output buffer
        event.data.writeFloat(r);
        outputLength++;
        _phase += _playbackSpeed;
        if (_phase < 0)
            _phase += _numSamples;
        else if (_phase >= _numSamples)
            _phase -= _numSamples;
    }
}
Volume:
Use, say, var volume:Number = 1.0; as a field variable: 0.0 for mute, 1.0 for original volume. Alter it from your other methods. (Tweening this variable rather than jumping it will be appreciated by listeners, as it avoids clicks.) Then scale each sample as you write it:
event.data.writeFloat(volume * l);
event.data.writeFloat(volume * r);
Speed:
You have to resample and use interpolation to define the intermediate values. It's mathematically involved, but I'm sure there are plenty of libraries that can do this for you. But hey, here's a tutorial that tells you how, apparently:
http://www.kelvinluck.com/2008/11/first-steps-with-flash-10-audio-programming/
Edit: Oh, you used this tutorial... you could have said. Then just modify _playbackSpeed: 1.0 is full speed, 2.0 is double speed.
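If you want the fractional part of _phase to count for something, a minimal linear-interpolation sketch (untested, and ignoring the wrap-around at the very end of the buffer) would replace the reads in the loop above with:
var i0:int = int(_phase);
var frac:Number = _phase - i0;
_loadedMP3Samples.position = i0 * 8;
var l0:Number = _loadedMP3Samples.readFloat();
var r0:Number = _loadedMP3Samples.readFloat();
var l1:Number = _loadedMP3Samples.readFloat(); // the next frame's samples
var r1:Number = _loadedMP3Samples.readFloat();
event.data.writeFloat(volume * (l0 + (l1 - l0) * frac)); // blend neighbouring frames
event.data.writeFloat(volume * (r0 + (r1 - r0) * frac));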