Create AVD for HTC Desire & Samsung Galaxy 3 on ADT v21.1

How do I create an AVD for the HTC Desire and Samsung Galaxy 3 phones using ADT v21.1?
All the advice I have found is about older versions of ADT.
Thank you

Device: 3.7" WVGA (480x800: hdpi)
Target: Android 2.3.3 - API Level 10
CPU/ABI: ARM (armeabi)
Uncheck "Hardware keyboard present"
Uncheck "Display a skin with hardware controls"
Memory Options: RAM: 576
Memory Options: VM Heap: 24
Internal Storage: 150 MiB
SD Card: Size: 4 GiB
You may want to use a smaller SD card, and an Intel Atom system image target or x86 CPU/ABI for better performance at the cost of less accurate emulation. Note that you need to have downloaded the corresponding platforms and system images in the SDK Manager beforehand.
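If you prefer the command line, a roughly equivalent AVD can be created with the ADT-era android tool. This is only a sketch: the AVD name is an example, the android-10 platform must already be installed, and WVGA800 is the built-in skin matching the 480x800 hdpi device above.

# Example only: create an ARM AVD targeting API 10 with a 4 GiB SD card.
android create avd --name HTC_Desire --target android-10 --abi armeabi --skin WVGA800 --sdcard 4096M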

Related

Android Emulator, graphical glitches on Windows 11

I have a Windows 11 machine with an RTX 3050 graphics card; it's a Dell G15 laptop. I cannot find a (good) solution to the graphical glitches that appear in the Android Emulator.
The only "solution" I found was to change hw.gpu.mode in the config.ini file from auto to guest. That fixes the glitches, but it causes really bad performance issues, and one app I developed with Flutter for my company straight up doesn't load (it loads when hw.gpu.mode=auto).
I'd appreciate it if you could point me in the right direction to solving this. Let me know if you need any other details about my machine:
OS: Windows 11 Home Single Language [64-bit]
Kernel: 10.0.22000.0
CPU: 11th Gen Intel(R) Core(TM) i5-11260H @ 2.60GHz
GPU: NVIDIA GeForce RTX 3050 Laptop GPU
Nvidia driver version: 516.94 (Downloaded the "Game ready driver" from GeForce Experience)
This problem only occurs on emulators running Android 12+; for testing I installed one device with Android 11 and a second device with Android 13.
It seems like dedicated Nvidia GPUs are causing the problem. I have a 3060 laptop, I have the same issue, and when I set it to guest it seems to work. My guess is that the setting switches rendering from the GPU to the CPU. I would recommend trying to make Android Studio use the integrated graphics instead of the dedicated GPU. Since I have an 8-core CPU compared to your 4-core CPU, I'm guessing that's why my performance isn't as bad.
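For reference, here is a minimal sketch of the two workarounds discussed above. The AVD name and path are placeholders; swiftshader_indirect is a software renderer, and angle_indirect uses ANGLE on Windows.

# In the AVD's config.ini (e.g. %USERPROFILE%\.android\avd\Pixel_5_API_33.avd\config.ini),
# force the guest (software) renderer as described in the question:
hw.gpu.enabled = yes
hw.gpu.mode = guest

# Or pick a renderer for a single launch without editing config.ini:
emulator -avd Pixel_5_API_33 -gpu swiftshader_indirect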

Do not see camera in AVD

I am a beginner in mobile app development. I am setting up my dev IDE and AVD. I have created a new AVD with the following specs:
Target: Android 7
Platform 7.0
API Level: 24
CPU: Intel Atom x86
Device Nexus 6
Skin: Skin with dynamic hardware controls
Front Camera: Emulated
Back Camera: Webcam0
Memory Ram: 512, VM Heap: 32
Internal Storage: 200 MiB
Use Host GPU
I am using Intel HAXM and have disabled Hyper-V. My issue is that I do not see a camera in the AVD, and I have to develop an app that uses the camera. Please help.
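One way to narrow this down, as a sketch (the AVD name below is an example): the emulator can list the host webcams it detects and can be launched with explicit camera options, which helps separate an AVD-configuration problem from a host-webcam problem.

# List the webcams the emulator can see on the host:
emulator -avd Nexus_6_API_24 -webcam-list

# Launch with an emulated front camera and the host webcam as the back camera,
# overriding the camera settings stored in the AVD:
emulator -avd Nexus_6_API_24 -camera-front emulated -camera-back webcam0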

Why is running an Android app so slow on Windows 7 with a Pentium N3540?

I installed Android Studio 2.2 and created a Nexus 4 emulator.
My emulator's details are:
Name: Nexus_4_API_23
CPU/ABI: Google APIs Intel Atom (x86_64)
Path: D:\Programs\Android\.android\avd\Nexus_4_API_23.avd
Target: google_apis [Google APIs] (API level 23)
Skin: nexus_4
SD Card: D:\Programs\Android\.android\avd\Nexus_4_API_23.avd\sdcard.img
hw.dPad: no
runtime.network.speed: full
hw.accelerometer: yes
hw.device.name: Nexus 4
vm.heapSize: 64
skin.dynamic: yes
hw.device.manufacturer: Google
hw.gps: yes
hw.initialOrientation: Portrait
image.androidVersion.api: 23
hw.audioInput: yes
image.sysdir.1: system-images\android-23\google_apis\x86_64\
tag.id: google_apis
showDeviceFrame: yes
hw.camera.back: emulated
hw.mainKeys: no
AvdId: Nexus_4_API_23
hw.camera.front: emulated
hw.lcd.density: 320
avd.ini.displayname: Nexus 4 API 23
hw.gpu.mode: auto
hw.device.hash2: MD5:6930e145748b87e87d3f40cabd140a41
hw.ramSize: 512
hw.trackBall: no
hw.battery: yes
hw.cpu.ncore: 2
hw.sdCard: yes
tag.display: Google APIs
runtime.network.latency: none
hw.keyboard: yes
hw.sensors.proximity: yes
disk.dataPartition.size: 800M
hw.sensors.orientation: yes
avd.ini.encoding: UTF-8
hw.gpu.enabled: yes
When I run a simple app it does a Gradle rebuild and launches it on the emulator, and it takes nearly 2 minutes every time I run it, even when the emulator is already on.
In the console, the run ends with these messages:
D/OpenGLRenderer: Use EGL_SWAP_BEHAVIOR_PRESERVED: true
I/OpenGLRenderer: Initialized EGL, version 1.4
D/gralloc_ranchu: Emulator without host-side GPU emulation detected.
HAXM is installed and working, and Intel Virtualization Technology is enabled in the BIOS.
What is wrong with my emulator options?
Why is my Android Studio so slow?
Solved it. In Android Studio go to Tools > Android > AVD Manager.
Click the edit button for your virtual device.
Next, find the Graphics option and select Hardware - GLES 2.0.
Now running the app while the emulator is on is very quick.
And the console no longer shows the message: Emulator without host-side GPU emulation detected.
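For reference, a sketch of the same change made outside the GUI; the values mirror the config dump above, and hw.gpu.mode host is what the Hardware graphics setting maps to in tool versions of that era (an assumption worth verifying on your install).

# Equivalent change in the AVD's config.ini (compare hw.gpu.mode: auto above):
hw.gpu.enabled = yes
hw.gpu.mode = host

# Or force host GPU rendering for a single run from the command line:
emulator -avd Nexus_4_API_23 -gpu host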
The best Android tutorial I've found is at https://www.udacity.com/course/new-android-fundamentals--ud851
There are more free Android tutorials at https://www.udacity.com/courses/android
Two suggestions for best performance:
Install Intel HAXM:
https://software.intel.com/en-us/android/articles/intel-hardware-accelerated-execution-manager
Enable "Use Host GPU" in the AVD's emulation options.
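A quick way to confirm the first suggestion is in effect on Windows (the service name below is the one the Intel installer typically registers, so treat it as an assumption):

# Check that the Intel HAXM kernel service is installed and running:
sc query intelhaxm

# When HAXM is active, the emulator typically prints
# "HAX is working and emulator runs in fast virt mode" at startup.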

Cannot see Nexus 5X from Android Studio on Mac

I cannot see my Nexus 5X (Android 6.0) in the Android Studio Device Chooser on my Mac.
But the device is connected to the Mac:
macbook-pro-de-mincong:~ minconghuang$ system_profiler SPUSBDataType
USB:
USB 3.0 Bus:
Capacity: 63.3 MB (63,292,320 bytes)
...
Nexus 5X:
Product ID: ...
Vendor ID: ... (Google Inc.)
Version: 3.10
Serial Number: ...
Speed: Up to 480 Mb/sec
Manufacturer: LGE
Location ID: ... / 26
Current Available (mA): 1000
Current Required (mA): 500
Extra Operating Current (mA): 0
However, Android Studio does work with another Android device running Android 4.4.4 (API 19); it is recognized within a second. So where might the problem come from? By the way, the Nexus 5X uses a USB-C connector and I use a USB-C to USB-A adapter; I don't know whether that causes the problem.
I don't know the mechanism, but when I restart my Mac, it works. (I hadn't rebooted in three weeks.)
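Before a full reboot, a lighter-weight thing to try is restarting the adb server, assuming the SDK platform-tools are on your PATH; the device must also have USB debugging enabled and the Mac authorized on its screen.

# Restart adb and re-list attached devices; the Nexus 5X should appear here:
adb kill-server
adb start-server
adb devices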

Does Kinect for Windows v2 work with Parallels?

Does Kinect for Windows v2 work on a Mac Pro using Windows 8.1 running on top of Parallels?
Considering Kinect v2's minimum hardware requirements below (copied from this MSDN blog), it is not possible for Windows 8/8.1 running on top of Parallels to recognize and run Kinect v2. The latest version of Parallels, v10 as of the time of this answer, only supports DirectX 10, which is below the minimum requirement. I have tried it myself, with no success even with Parallels' Gaming Mode. Moreover, for the Kinect to be recognized you need the full USB 3.0 bandwidth.
An alternative solution, as discussed in this MSDN blog, is to use Windows To Go or to install Windows using Boot Camp.
Kinect v2's minimum required capabilities:
64 bit (x64) processor
4 GB Memory (or more)
i7 3.1 GHz (or higher)
Built-in USB 3.0 host controller (Intel or Renesas chipset).
If you’re adding USB 3.0 functionality to your existing PC through an adapter, please ensure that it is a Windows 8 compliant device and that it supports Gen-2. See the troubleshooting section of Getting Started for more information.
DX11 capable graphics adapter (see list of known good adapters below)
Intel HD 4400 integrated display adapter
ATI Radeon HD 5400 series
ATI Radeon HD 6570
ATI Radeon HD 7800 (256bit GDDR5 2GB/1000Mhz)
NVidia Quadro 600
NVidia GeForce GT 640
NVidia GeForce GTX 660
NVidia Quadro K1000M
A Kinect v2 sensor, which includes a power hub and USB cabling.
