Double download with WebClient.OpenReadAsync - windows-phone-7

OK, in my app I need to download two lists of data and then process them, but I can't work out how to do it.
I click a button and the two downloads start almost at the same time. That part is fine; what isn't fine is that my application doesn't wait for the downloads to finish before doing anything else.
I know there's a design problem, but I cannot figure out how to resolve it.
The code is (more or less) like this:
private void button_Click(object sender, RoutedEventArgs e)
{
    try
    {
        WebClient webClient = new WebClient();
        Uri uri = new Uri("http://myRESTservice");
        webClient.OpenReadCompleted += new OpenReadCompletedEventHandler(webClient_OpenReadCompleted);
        webClient.OpenReadAsync(uri); // this will set a private variableA
        dwnl();
        doSomething(); // this will do something with A and B
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
}

private void dwnl()
{
    try
    {
        WebClient webClient = new WebClient();
        Uri uri = new Uri("http://myRESTservice/anotherAddress");
        webClient.OpenReadCompleted += new OpenReadCompletedEventHandler(webClient_OpenReadCompleted_B);
        webClient.OpenReadAsync(uri); // this will set a private variableB
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
}
I hope the problem is clear.

While your application is downloading the data, i.e. after OpenReadAsync has been called, you could show a busy indicator. Your doSomething method would then be called from within your OpenReadCompleted event handler.
If you want one download to occur only after the other has completed, you could also call the dwnl method from within your OpenReadCompleted event handler; a rough sketch of that chaining is shown below.
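As a minimal sketch of that idea, assuming the private fields variableA and variableB simply hold the opened streams and doSomething is the method from the question, the second download is started from the first completion handler and doSomething runs only once both results are in:

private void button_Click(object sender, RoutedEventArgs e)
{
    // Sketch only: chains the two downloads so doSomething runs after both have completed.
    var webClientA = new WebClient();
    webClientA.OpenReadCompleted += (s, argsA) =>
    {
        variableA = argsA.Result;          // first download finished
        var webClientB = new WebClient();
        webClientB.OpenReadCompleted += (s2, argsB) =>
        {
            variableB = argsB.Result;      // second download finished
            doSomething();                 // both A and B are now available
        };
        webClientB.OpenReadAsync(new Uri("http://myRESTservice/anotherAddress"));
    };
    webClientA.OpenReadAsync(new Uri("http://myRESTservice"));
}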

Related

Turning Bluetooth Tethering On in Xamarin.Android

I'm currently trying to add some Bluetooth functionality to my app. I want to be able to turn Bluetooth tethering on or off, as well as check its status.
I found the Java code on Stack Overflow: How to check Bluetooth tethering status programmatically in Android
I have translated it into C#, but I don't seem to be able to get any result.
Regardless of the tethering setting, it always shows the toast with "Tethering:false", and setBluetoothTethering doesn't change anything.
Any idea what I'm missing?
Here's my code:
[...]
try
{
    Class classBluetoothPan = Class.ForName("android.bluetooth.BluetoothPan");
    Method mBTPanConnect = classBluetoothPan.GetDeclaredMethod("connect", Class.FromType(typeof(BluetoothDevice)));
    Constructor BTPanCtor = classBluetoothPan.GetDeclaredConstructor(Class.FromType(typeof(Context)), Class.FromType(typeof(IBluetoothProfileServiceListener)));
    BTPanCtor.Accessible = true;
    Java.Lang.Object BTSrvInstance = BTPanCtor.NewInstance(Activity, new BTPanServiceListener(Activity));
    Method isTetheringOnMethod = classBluetoothPan.GetDeclaredMethod("isTetheringOn", null);
    var isTetheringOn = isTetheringOnMethod.Invoke(BTSrvInstance);
    Toast.MakeText(Activity, "Tethering:" + isTetheringOn, ToastLength.Short).Show();
    Method setBluetoothTetheringMethod = classBluetoothPan.GetDeclaredMethod("setBluetoothTethering", new Class[1] { Class.FromType(typeof(bool)) });
    setBluetoothTetheringMethod.Invoke(BTSrvInstance, true);
    // tether = !tether;
}
catch (ClassNotFoundException e)
{
    e.PrintStackTrace();
}
catch (Java.Lang.Exception e)
{
    e.PrintStackTrace();
}
[...]
public class BTPanServiceListener : Java.Lang.Object, IBluetoothProfileServiceListener
{
    private Activity _activity;

    public BTPanServiceListener(Activity activity)
    {
        _activity = activity;
    }

    public void OnServiceConnected([GeneratedEnum] ProfileType profile, IBluetoothProfile proxy)
    {
        // throw new NotImplementedException();
    }

    public void OnServiceDisconnected([GeneratedEnum] ProfileType profile)
    {
        // throw new NotImplementedException();
    }
}
I figured out how to enable Bluetooth tethering via setBluetoothTethering.
I wrote an entire blog post about this.
You can find the final code here.
I assume that isTetheringOn works in the same way.
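For reference, a minimal sketch of the kind of change that typically makes this reflection call work (an assumption on my part, not necessarily the blog's exact code) is to look up setBluetoothTethering with the primitive boolean class, Java.Lang.Boolean.Type, instead of Class.FromType(typeof(bool)), and to pass a boxed Java boolean when invoking it:

// Sketch only, against the hidden android.bluetooth.BluetoothPan API;
// BTSrvInstance is the proxy instance from the question, obtained via the reflected constructor.
Class classBluetoothPan = Class.ForName("android.bluetooth.BluetoothPan");
Method setBluetoothTetheringMethod =
    classBluetoothPan.GetDeclaredMethod("setBluetoothTethering", Java.Lang.Boolean.Type);
setBluetoothTetheringMethod.Invoke(BTSrvInstance, Java.Lang.Boolean.ValueOf(true));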

When does MessageWebSocket receive data?

Info: Despite using the WebSocket tag, I am using MessageWebSocket in my code because I am coding on UWP.
MessageWebSocket has an event called MessageReceived. I added a TypedEventHandler to this event when initializing the MessageWebSocket:
messageWebSocket.MessageReceived += new TypedEventHandler<MessageWebSocket, MessageWebSocketMessageReceivedEventArgs>(OnMessageRecieved);
After sending data with a method called SendData(), I expected the MessageReceived event to be fired. But it isn't, and I don't know why.
This is my SendData() method:
private async void SendData(DataWriter dataWriter)
{
    try
    {
        _evaLogger.Info("Trying to send data...");
        IBuffer buffer = dataWriter.DetachBuffer();
        await messageWebSocket.OutputStream.WriteAsync(buffer);
        _evaLogger.Info("Data was sent");
    }
    catch (Exception e)
    {
        _evaLogger.Error(e.Message, e);
    }
}
If not after sending data, when does MessageWebSocket receive data?
I was able to figure out what the problem was. The server was expecting text, but I was sending it raw binary data. This is what the solution for sending text looks like:
private async Task SendData(DataWriter dataWriter)
{
    try
    {
        _evaLogger.Info("Trying to send data...");
        await dataWriter.StoreAsync();
        _evaLogger.Info("Data was sent");
    }
    catch (Exception e)
    {
        _evaLogger.Error(e.Message, e);
    }
}
It's also important to set the MessageType to Utf8:
messageWebsocket.Control.MessageType = SocketMessageType.Utf8;
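To actually see the server's reply, the handler registered for MessageReceived reads the incoming message from the event arguments. A minimal sketch of such a handler (OnMessageRecieved and _evaLogger are the names used in the question) could look like this:

private void OnMessageRecieved(MessageWebSocket sender, MessageWebSocketMessageReceivedEventArgs args)
{
    try
    {
        // GetDataReader gives access to the received message; read it as UTF-8 text.
        using (DataReader reader = args.GetDataReader())
        {
            reader.UnicodeEncoding = Windows.Storage.Streams.UnicodeEncoding.Utf8;
            string message = reader.ReadString(reader.UnconsumedBufferLength);
            _evaLogger.Info("Received: " + message);
        }
    }
    catch (Exception e)
    {
        _evaLogger.Error(e.Message, e);
    }
}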

KitKat is not letting me upload images

I am currently using the default file chooser/image picker in Android to select the image I want to upload to my server, but the file chooser is not working on Android KitKat. Whenever I choose an image with it, the URI (the local address of my image) is returned as null. My code works perfectly on other Android devices, from Android 2.2/2.3 up to 4.2/4.3.
What I would love to know is whether there is a way around this, or whether there is a custom file chooser or a script that I should be using.
Any help is appreciated since this is my first time on Stack Overflow. Thank you.
Never mind, I found it.
Bitmap bitmap;
private static final int READ_REQUEST_CODE = 42;

@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    requestWindowFeature(Window.FEATURE_NO_TITLE);
    getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
            WindowManager.LayoutParams.FLAG_FULLSCREEN);
    setContentView(R.layout.activity_main);

    Intent intent = new Intent(Intent.ACTION_OPEN_DOCUMENT);
    // Filter to only show results that can be "opened", such as a
    // file (as opposed to a list of contacts or timezones).
    intent.addCategory(Intent.CATEGORY_OPENABLE);
    // Filter to show only images, using the image MIME data type.
    // If one wanted to search for ogg vorbis files, the type would be "audio/ogg".
    // To search for all documents available via installed storage providers,
    // it would be "*/*".
    intent.setType("image/*");
    startActivityForResult(intent, READ_REQUEST_CODE);
}

@Override
public void onActivityResult(int requestCode, int resultCode, Intent resultData) {
    // The ACTION_OPEN_DOCUMENT intent was sent with the request code
    // READ_REQUEST_CODE. If the request code seen here doesn't match, it's the
    // response to some other intent, and the code below shouldn't run at all.
    if (requestCode == READ_REQUEST_CODE && resultCode == Activity.RESULT_OK) {
        // The document selected by the user won't be returned in the intent.
        // Instead, a URI to that document will be contained in the return intent
        // provided to this method as a parameter.
        // Pull that URI using resultData.getData().
        Uri uri = null;
        if (resultData != null) {
            uri = resultData.getData();
            try {
                bitmap = MediaStore.Images.Media.getBitmap(this.getContentResolver(), uri);
            } catch (FileNotFoundException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            }
            ImageView my_img_view = (ImageView) findViewById(R.id.uploadlayout2);
            my_img_view.setImageBitmap(bitmap);
        }
    }
}
This worked for me.
Just remove the last lines about declaring a string and toasting the URI, and you are good to go.

Player crashes in application

Hi all. I am trying to develop an application for Windows Phone 7 using Visual Studio 2010. It is a music player that is supposed to play music based on the current calendar event.
I managed to extract the event, but when I tried to combine it with the player, the whole player just crashes. Here is the code:
void Appointments_SearchCompleted(object sender, AppointmentsSearchEventArgs e)
{
    try
    {
        AppointmentResultsDataLINQ.DataContext =
            from Appointment appt in e.Results
            where appt.IsAllDayEvent == false
            select appt;
    }
    catch (System.Exception)
    {
        // No results
    }
}

private void button2_Click(object sender, RoutedEventArgs e)
{
    if ((((Appointment)(AppointmentResultsDataLINQ.DataContext)).Subject).Equals("Meeting"))
    {
        mediaElement1.Source = new Uri("http://www.opendrive.com/files/NV8zNTMwNDYwX2hxRXZR/Crystallize.mp3", UriKind.Absolute);
    }
    else
    {
        mediaElement1.Source = new Uri("https://www.opendrive.com/files/NV8zMjAxODY0X0VBNDJY/Hetken%20tie%20on%20kevyt%20(piano%20cover)%20-%20YouTube.mp3", UriKind.Absolute);
    }
    mediaElement1.Play();
}
The problem is the cast. You are trying to cast AppointmentResultsDataLINQ.DataContext to an Appointment, but the DataContext holds the result of a LINQ query (a sequence of appointments), not a single Appointment, so the cast cannot succeed. You need to select one concrete appointment using LINQ, similar to the code in your Appointments_SearchCompleted handler (which, imho, currently achieves nothing); a rough sketch follows.
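As a minimal sketch of that idea (the _appointments field is my own addition, it is not in the original code, and the query requires System.Linq):

// Keep the filtered results and pick one concrete Appointment instead of casting the DataContext.
private IEnumerable<Appointment> _appointments; // assumed field

void Appointments_SearchCompleted(object sender, AppointmentsSearchEventArgs e)
{
    _appointments = e.Results.Where(appt => !appt.IsAllDayEvent).ToList();
    AppointmentResultsDataLINQ.DataContext = _appointments;
}

private void button2_Click(object sender, RoutedEventArgs e)
{
    // FirstOrDefault yields a single Appointment (or null) that can safely be inspected.
    Appointment first = (_appointments == null) ? null : _appointments.FirstOrDefault();
    if (first != null && first.Subject == "Meeting")
    {
        mediaElement1.Source = new Uri("http://www.opendrive.com/files/NV8zNTMwNDYwX2hxRXZR/Crystallize.mp3", UriKind.Absolute);
    }
    else
    {
        mediaElement1.Source = new Uri("https://www.opendrive.com/files/NV8zMjAxODY0X0VBNDJY/Hetken%20tie%20on%20kevyt%20(piano%20cover)%20-%20YouTube.mp3", UriKind.Absolute);
    }
    mediaElement1.Play();
}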

GLSurfaceView.Renderer crashes when resuming because "bitmap is recycled"

Once again I need some help.
Yesterday I asked a question about how to use a large JPG image as a Bitmap (http://stackoverflow.com/questions/13511657/problems-with-big-drawable-jpg-image) and resolved it myself (see my own answer on that question). But whenever I resume my activity, which uses that bitmap as the GLRenderer texture, it crashes. I've tried many things; the last attempt was to make the bitmap static in order to keep it as a member variable of the activity, but it still crashes because, I suppose, it loses its buffer.
More details on the Activity code:
I declared the activity as single-instance in the manifest:
android:launchMode="singleInstance"
in order to keep the tiles for the renderer.
And here is some code:
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    mGLSurfaceView = new GLSurfaceView(this);
    mGLSurfaceView.setEGLConfigChooser(true);
    mSimpleRenderer = new GLRenderer(this);
    getTextures();
    if (!mIsTileMapInitialized) {
        tileMap = new LandSquareGrid(1, 1, mHeightmap, mLightmap, false, true, true, 128, true);
        tileMap.setupSkybox(mSkyboxBitmap, true);
        mIsTileMapInitialized = true;
    }
    initializeRenderer();
    mGLSurfaceView.setRenderer(mSimpleRenderer);
    setContentView(R.layout.game_layout);
    setOnTouchListener();
    initializeGestureDetector();
    myCompassView = (MyCompassView) findViewById(R.id.mycompassview);
    // Once the content view is set we can get the TextViews:
    coordinatesText = (TextView) findViewById(R.id.coordDynamicText);
    altitudeText = (TextView) findViewById(R.id.altDynamicText);
    directionText = (TextView) findViewById(R.id.dirDynamicText);
    //if (!mIsGLInitialized){
    mOpenGLLayout = (LinearLayout) findViewById(R.id.openGLLayout);
    mOpenGLLayout.addView(mGLSurfaceView);
    mVirtual3DMap = new Virtual3DMap(mSimpleRenderer, tileMap);
    if (mGameThread == null) {
        mGameThread = new Thread(mVirtual3DMap);
        mGameThread.start();
    }
}
In the getTextures method I load a few small textures and the largest one, as in my self-answer to the previous question:
if (mTerrainBitmap == null) {
    InputStream is = getResources().openRawResource(R.drawable.terrain);
    try {
        // Set terrain bitmap options to 16-bit, 565 format.
        terrainBitmapOptions.inPreferredConfig = Bitmap.Config.RGB_565;
        Bitmap auxBitmap = BitmapFactory.decodeStream(is, null, terrainBitmapOptions);
        mTerrainBitmap = Bitmap.createBitmap(auxBitmap);
    }
    catch (Exception e) {
    }
    finally {
        try {
            is.close();
        }
        catch (IOException e) {
            // Ignore.
        }
    }
}
So, again, the first time it works great, but when I leave the activity I do:
protected void onPause() {
    super.onPause();
    mGLSurfaceView.onPause();
}
@Override
protected void onStop() {
    // TODO Auto-generated method stub
    super.onStop();
    if (mVirtual3DMap != null) {
        try {
            mVirtual3DMap.cancel();
            mGameThread = null;
            mVirtual3DMap = null;
            mGLSurfaceView.destroyDrawingCache();
            mSimpleRenderer = null;
            System.gc();
        } catch (Throwable e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
}
And when I resume the activity:
@Override
protected void onResume() {
    super.onResume();
    mGLSurfaceView.onResume();
    if (mVirtual3DMap != null) {
        try {
            mVirtual3DMap.resume();
        } catch (Throwable e) {
            e.printStackTrace();
        }
    }
}
And it crashes.
Why? OK, here is the cause of the exception on the GL thread:
java.lang.IllegalArgumentException: bitmap is recycled...
I tried this messy stuff because, after launching the original activity more than two times, the application crashes either because of this error or because of the amount of memory used, and now I don't know whether to revert all these changes or what to do next.
Is there a good way to keep this bitmap in memory, usable by this or another activity of the application?
Please, I need your advice.
Thanks in advance.
Do not manage the GL resources manually, or your app's surface will break; you can't keep them alive yourself across a pause/resume.
If you are worried about reloading resources and you target API level 11+, you can use setPreserveEGLContextOnPause(). It will preserve your textures and FBOs.
If you can't use API 11+, you can port GLSurfaceView into your app. You can check my own GLSurfaceView, which is ported from ICS.
PS: Sorry about my poor English.
No. Let Android handle all the resources. You must handle the appropriate lifecycle events and reload the bitmap when the activity is resumed. You cannot expect any OpenGL handles to still be valid after the activity has been resumed.
Think of it like a laptop coming out of hibernation: although all memory has been restored, you cannot expect any open socket to still have a live connection.
I am an Android newbie, so please correct me if I am wrong.
