Magic in Windows Phone isolated storage performance - windows-phone-7

I have merged many small files into one big file. On the app's first start this file is read and the small files are created one by one in the file system (Isolated Storage).
When the file contains 44 small files and is ~200 KB, the algorithm takes ~120 ms on the device.
When the file contains 140 even smaller files and is ~400 KB, the algorithm takes ~3000 ms on the device.
If I copy only 44 files from each, the first file still takes ~120 ms, while the second takes ~800 ms.
This looks like magic to me.
The format of the data in the file is simple:
INT32   ENTRIES COUNT
STRING  ENTRY NAME         |
INT32   ENTRY DATA LENGTH  |  repeats {ENTRIES COUNT} times
BYTE[]  ENTRY DATA         |
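For reference, a file in this layout can be produced with a plain BinaryWriter, since BinaryReader.ReadString reads exactly the length-prefixed string that BinaryWriter.Write(string) emits. This is only a sketch of the packing side (the PackFiles helper and its dictionary parameter are illustrative, not code from the repro project):
private static void PackFiles(Stream output, IDictionary<string, byte[]> entries)
{
    var writer = new BinaryWriter(output);
    writer.Write(entries.Count);            // INT32  entries count
    foreach (var entry in entries)
    {
        writer.Write(entry.Key);            // STRING entry name (length-prefixed)
        writer.Write(entry.Value.Length);   // INT32  entry data length
        writer.Write(entry.Value);          // BYTE[] entry data
    }
    writer.Flush();
}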
To me this looks like magic in the Windows Phone IsolatedStorage mechanism.
There is no apparent reason for the second file to be 7-8 times slower when copying an equal number of entries.
Repro project - https://www.dropbox.com/s/6bjsve7p8wew3kb/IsoStorageWonder.zip?m
Code:
public static void CopyCache(ILogger logger)
{
    using (var isoStorage = IsolatedStorageFile.GetUserStoreForApplication())
    {
        var streamInfo = Application.GetResourceStream(new Uri(_dataFilePath, UriKind.RelativeOrAbsolute));
        isoStorage.CreateDirectory("HttpCache");
        using (var binaryReader = new BinaryReader(streamInfo.Stream))
        {
            // total number of entries packed into the resource file
            int itemsCount = binaryReader.ReadInt32();
            // ENTRIES_COUNT is a constant that limits how many entries are copied
            // (e.g. 44, to compare equal workloads against both resource files)
            for (int i = 0; i < ENTRIES_COUNT; i++)
            {
                string fileName = binaryReader.ReadString();
                int length = binaryReader.ReadInt32();
                byte[] data = binaryReader.ReadBytes(length);
                using (var fileStream = new IsolatedStorageFileStream(
                    Path.Combine(_rootCacheDir, fileName),
                    FileMode.Create,
                    FileAccess.Write,
                    FileShare.None,
                    isoStorage))
                {
                    fileStream.Write(data, 0, data.Length);
                }
            }
        }
    }
}
MAGIC!

I had a similar problem with WebClient performance: in the emulator a request took 0.3-0.5 seconds, on the device 8-22 seconds. I was very confused. In my case the solution was very simple: DO NOT TEST PERFORMANCE ON THE DEVICE IN DEBUG MODE. What I do:
Deploy the project to your device.
Stop debugging.
Close your app on the phone (better yet, reboot the device).
Everything works like a charm))
In your test app IsoStorageWonder:
Emulator: 551 ms
Emulator 256 MB: 564 ms
HTC Radar WP7.8, Debug Mode: 1835 ms
HTC Radar WP7.8, Not Debug Mode: 958 ms
I hope my research helps you. Regards
UPD
Test with output 2:
Emulator: 440 ms
Emulator 256 MB: 447 ms
HTC Radar WP7.8, Debug Mode: 287 ms // very nice
HTC Radar WP7.8, Not Debug Mode: 144 ms // also nice
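For reference, the timings above come from simply wrapping the copy call in a Stopwatch and running the app detached from the debugger: a minimal sketch, assuming the CopyCache method from the question and whatever ILogger instance the app already has:
var stopwatch = System.Diagnostics.Stopwatch.StartNew();
CopyCache(logger);
stopwatch.Stop();
// Show the result on the device itself, since no debugger output is available
MessageBox.Show(string.Format("CopyCache took {0} ms", stopwatch.ElapsedMilliseconds));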

Related

winapi: bypass Windows high contrast fps limit

I'm using Windows 10 Professional's high contrast mode for working at night.
I'm working on a Cocos2d-x 3.15 project which uses the following Win32 API calls:
UINT TARGET_RESOLUTION = 1; // 1 millisecond target resolution
TIMECAPS tc;
UINT wTimerRes = 0;
if (TIMERR_NOERROR == timeGetDevCaps(&tc, sizeof(TIMECAPS)))
{
    wTimerRes = std::min(std::max(tc.wPeriodMin, TARGET_RESOLUTION), tc.wPeriodMax);
    timeBeginPeriod(wTimerRes); // request minimum timer resolution of 1 ms
}
The following is not the actual code, just an example:
while (true) // main loop
{
    mainGameLoop();
    int workedTime = calculateWorkedTime(); // milliseconds
    if (workedTime < 1000 / 60) // target 60 fps
    {
        int remainingTime = 1000 / 60 - workedTime;
        Sleep(remainingTime);
    }
}
Normally my game runs at (nearly) 60 fps on Windows, but in high contrast mode it is limited to 30 fps, which feels very annoying when testing the game.
I have played many games that run at 60 fps just fine. Can I do some trick in my project (using the Win32 API) to bypass the 30 fps limit?
P.S.: this project uses OpenGL.

Geolocator and accuracy in Windows Phone 8

I have a few questions about Geolocator and the DesiredAccuracy property.
I have the method GetMyPosition:
public async Task<Geoposition> GetMyPosition()
{
    Geoposition myGeoposition = null;
    Geolocator myGeolocator = new Geolocator();
    myGeolocator.DesiredAccuracy = PositionAccuracy.High;
    try
    {
        myGeoposition = await myGeolocator.GetGeopositionAsync();
        return myGeoposition;
    }
    catch (Exception ex)
    {
        Deployment.Current.Dispatcher.BeginInvoke(() =>
        {
            MessageBox.Show("Can't get the position");
        });
        return null;
    }
}
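(For context, a hypothetical call site, not part of the original app, would await it like this:)
Geoposition position = await GetMyPosition();
if (position != null)
{
    // PositionSource shows whether the fix came from Cellular, WiFi or Satellite
    var source = position.Coordinate.PositionSource;
    double latitude = position.Coordinate.Latitude;
    double longitude = position.Coordinate.Longitude;
}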
1) Why does
Geolocator.DesiredAccuracy = PositionAccuracy.High;
Geolocator.GetGeopositionAsync();
always return Geoposition.Coordinate.PositionSource = Cellular with an accuracy of 400-1600 m (on a Nokia Lumia 520 device)?
2) With what settings can I get high accuracy (50-100 m) and PositionSource = Satellite?
3) If I have offline maps loaded on my device and airplane mode is activated, will this code
Geolocator myGeolocator = new Geolocator();
myGeolocator.DesiredAccuracy = PositionAccuracy.High;
try
{
    myGeoposition = await myGeolocator.GetGeopositionAsync();
    return myGeoposition;
}
still work? Without cellular, using only the satellite?
4) How strongly does the precision of the coordinates depend on the device?
Thanks in advance!
Taken from MSDN
Although the Location Service uses multiple sources of location information, and any of the sources may not be available at any given time (for example, no GPS satellites or cell phone towers may be accessible), the native code layer handles the work of evaluating the available data and choosing the best set of sources. All your application needs to do is to choose between high accuracy or the default, power-optimized setting. You can set this value when you initialize the main Location Service class, GeoCoordinateWatcher.
C#
GeoCoordinateWatcher watcher = new GeoCoordinateWatcher(GeoPositionAccuracy.Default);
So it seems you can't control which source is used; rather, the available sources are used based on the position accuracy specified on the GeoCoordinateWatcher. Try initializing a GeoCoordinateWatcher with high accuracy and see what happens:
var geoWatcher = new GeoCoordinateWatcher(GeoPositionAccuracy.High);
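A minimal sketch of wiring the watcher up (using System.Device.Location and System.Diagnostics; the handler body is just illustrative):
var watcher = new GeoCoordinateWatcher(GeoPositionAccuracy.High);
watcher.PositionChanged += (s, e) =>
{
    GeoCoordinate coordinate = e.Position.Location;
    // HorizontalAccuracy is in meters and usually improves after the first few fixes
    Debug.WriteLine("{0}, {1} (+/- {2} m)",
        coordinate.Latitude, coordinate.Longitude, coordinate.HorizontalAccuracy);
};
watcher.Start();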
You can use
Geolocator myGeolocator = new Geolocator();
myGeolocator.DesiredAccuracyInMeters = 20;
...
to state explicitly how accurate you want the location to be, which lets the device manage its power a little better. Whether you get close to that accuracy, though, depends on the quality of the fix the device can obtain; if you're inside a building, for example, you won't get anything that accurate without connecting to WiFi.
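A sketch of that variant, assuming the Windows Phone 8 Geolocator API (the maximumAge/timeout values below are arbitrary examples):
Geolocator myGeolocator = new Geolocator();
myGeolocator.DesiredAccuracyInMeters = 20;
// maximumAge: accept a cached fix up to 1 minute old; timeout: give up after 10 seconds
Geoposition myGeoposition = await myGeolocator.GetGeopositionAsync(
    TimeSpan.FromMinutes(1), TimeSpan.FromSeconds(10));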

iTextSharp PDFWriter bottleneck

So I'm taking 10,000 two-page PDF files and merging them into one with iTextSharp.
This is a loose version of what I'm doing:
Document document = new Document();
using (PdfWriter writer = PdfWriter.GetInstance(document, new FileStream("merged.pdf", FileMode.Create)))
{
    document.Open();
    PdfContentByte cb = writer.DirectContent;
    PdfReader reader = null;
    foreach (string thisFile in files)
    {
        reader = new PdfReader(thisFile);
        var page1 = writer.GetImportedPage(reader, 1);
        var page2 = writer.GetImportedPage(reader, 2);
        document.NewPage();
        cb.AddTemplate(page1, 1f, 0, 0, 1f, 0, 0);
        document.NewPage();
        cb.AddTemplate(page2, 1f, 0, 0, 1f, 0, 0);
    }
    document.Close();
}
I'm trying to understand where the bottlenecks are. I ran some performance tests and the slowest steps are, naturally, reading each file with PdfReader and the Dispose that saves the file, called at the end of the using PdfWriter block.
I'm getting about 25% utilization across all 16 cores for this process. I tried a solid state drive instead of my 7,200 rpm SATA drive and it's almost exactly the same speed.
How can I speed this process up? Distributing the task isn't an option because the read speed between computers would be even slower. Even if it means changing to another language or library, or writing this at a lower level, I need this process to finish much faster than it currently does. Right now the merge takes about 10 minutes.
So I finally solved this. Here are my performance results, with the code of the winning approach below.
I used the same machine for all three of these tests.
iTextSharp - content builder directly on a PdfWriter
    Windows 2008 64 bit, NTFS partition
    merges about 30 pages per second during processing
    significant overhead at the end when closing out the PdfWriter
    25 pages per second overall
iTextSharp - PdfCopy
    Windows 2008 64 bit, NTFS partition
    writes the output to disk instead of memory, so no overhead at the end
    40 pages per second
iText (Java) - PdfCopy (exact same code, just ported to Java)
    Ubuntu 12.04 64 bit server edition, EXT3 partition (going to try ext4 soon)
    also writes the output to disk during processing
    250 pages per second
I haven't tried to figure out why the same code runs faster in Java on Ubuntu, but I'll take it. Note that all major variables are defined outside of this function, since it gets called 36,000 times during the process.
public void addPage(String inputPdf, String barcodeText, String pageTitle)
{
    try
    {
        // read in the pdf
        reader = new PdfReader(inputPdf);
        // all pdfs must have 2 pages (front and back);
        // set to throw an out-of-bounds error if not, caught upstream
        for (int i = 1; i <= Math.Min(reader.NumberOfPages, 2); i++)
        {
            // import the page from the source pdf
            copiedPage = copyPdf.GetImportedPage(reader, i);
            // add the page to the new document
            copyPdf.AddPage(copiedPage);
        }
        // clean up this reader; keeps a big memory leak away
        copyPdf.FreeReader(reader);
        copyPdf.Flush();
    }
    finally
    {
        reader.Close();
    }
}
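The member variables are created once outside addPage; a minimal sketch of that setup, assuming iTextSharp's PdfCopy (the openOutput/closeOutput helpers and the output path are illustrative, not from the original code):
// declared once and reused across the ~36,000 addPage calls
Document document;
PdfCopy copyPdf;
PdfReader reader;
PdfImportedPage copiedPage;

public void openOutput(string outputPath)
{
    document = new Document();
    copyPdf = new PdfCopy(document, new FileStream(outputPath, FileMode.Create));
    document.Open();
}

public void closeOutput()
{
    // finishes the PDF and closes the underlying stream
    document.Close();
}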
Give PdfSmartCopy a try. I'm not sure whether it's faster or not.
Document document = new Document();
using (PdfSmartCopy copy = new PdfSmartCopy(document, new FileStream("merged.pdf", FileMode.Create)))
{
    document.Open();
    foreach (string thisFile in files)
    {
        PdfReader reader = new PdfReader(thisFile);
        copy.AddPage(copy.GetImportedPage(reader, 1));
        copy.AddPage(copy.GetImportedPage(reader, 2));
        reader.Close();
    }
    document.Close();
}

WIA + network scanner with adf = 1 page

I am writing a program that works with a network scanner through WIA.
Everything works fine when scanning a single page. When I turn on the feeder:
foreach (WIA.Property deviceProperty in wia.Properties)
{
    if (deviceProperty.Name == "Document Handling Select")
    {
        int value = duplex ? 0x004 : 0x001;
        deviceProperty.set_Value(value);
    }
}
the program receives one scan and the signal that there are still documents in the feeder, and then fails with a COM error (while the scanner continues scanning).
Here's the code that checks for pages in the feeder:
// determine if there are any more pages waiting
Property documentHandlingSelect = null;
Property documentHandlingStatus = null;
foreach (Property prop in wia.Properties)
{
    if (prop.PropertyID == WIA_PROPERTIES.WIA_DPS_DOCUMENT_HANDLING_SELECT)
        documentHandlingSelect = prop;
    if (prop.PropertyID == WIA_PROPERTIES.WIA_DPS_DOCUMENT_HANDLING_STATUS)
        documentHandlingStatus = prop;
}
if ((Convert.ToUInt32(documentHandlingSelect.get_Value()) & 0x00000001) != 0)
{
    return ((Convert.ToUInt32(documentHandlingStatus.get_Value()) & 0x00000001) != 0);
}
return false;
The code that acquires the image:
imgFile = (ImageFile)WiaCommonDialog.ShowTransfer(item, wiaFormatJPEG, false);
Unfortunately I could not find an example of using WIA over WSD. Perhaps there are some settings that allow retrieving multiple images through WSD.
I had almost the same problem using WIA 2.0 with VBA to control a Brother MFC-5895CW multi-function scanner.
When I transferred scans from the ADF, I was not able to capture more than 2 pages into image objects (and I probably tried every existing option and spent days and hours on that problem!).
The only solution I found with that scanner was to use the ShowAcquisitionWizard method of the WIA.CommonDialog object to batch-transfer all scanned files to a specified folder. It was more a workaround than a satisfying solution for me, because the post-processing would have become more complicated.
Surprise, surprise: when I tried the same procedure on my client's Neat scanner, ShowAcquisitionWizard delivered only one scanned page to the specified folder and the other pages disappeared.
To my second surprise, with the CommonDialog.ShowTransfer method I was able to transfer all scanned documents picture by picture into image objects in my application.
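A rough sketch of that picture-by-picture loop, assuming a HasMorePages() helper that wraps the feeder-status check from the question (item, wiaFormatJPEG and WiaCommonDialog are the same objects used above):
var images = new List<ImageFile>();
do
{
    // acquire exactly one page from the feeder per call
    ImageFile img = (ImageFile)WiaCommonDialog.ShowTransfer(item, wiaFormatJPEG, false);
    if (img != null)
        images.Add(img);
} while (HasMorePages()); // HasMorePages() = the DOCUMENT_HANDLING_STATUS check above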

mouse double click is not working quite well

I am using the following code to record the screen. While recording, double-clicking an item with the mouse (for example, double-clicking a .ppt file to open it in PowerPoint) is not very responsive. I have tried the screen recording function of Windows Media Encoder 9 and it is much better. Any ideas what is wrong?
My environment: Windows Vista + Windows Media Encoder 9 + VSTS 2008 + C#. I put the following code in the initialization code of a Windows Forms application, and I suspect something is wrong with my Windows Forms application?
My code:
IWMEncSourceGroup SrcGrp;
IWMEncSourceGroupCollection SrcGrpColl;
SrcGrpColl = encoder.SourceGroupCollection;
SrcGrp = (IWMEncSourceGroup)SrcGrpColl.Add("SG_1");

IWMEncVideoSource2 SrcVid;
IWMEncSource SrcAud;
SrcVid = (IWMEncVideoSource2)SrcGrp.AddSource(WMENC_SOURCE_TYPE.WMENC_VIDEO);
SrcAud = SrcGrp.AddSource(WMENC_SOURCE_TYPE.WMENC_AUDIO);
SrcVid.SetInput("ScreenCap://ScreenCapture1", "", "");
SrcAud.SetInput("Device://Default_Audio_Device", "", "");

// Specify a file object in which to save encoded content.
IWMEncFile File = encoder.File;
string CurrentFileName = Guid.NewGuid().ToString();
File.LocalFileName = CurrentFileName;
CurrentFileName = File.LocalFileName;

// Choose a profile from the collection.
IWMEncProfileCollection ProColl = encoder.ProfileCollection;
IWMEncProfile Pro;
for (int i = 0; i < ProColl.Count; i++)
{
    Pro = ProColl.Item(i);
    if (Pro.Name == "Screen Video/Audio High (CBR)")
    {
        SrcGrp.set_Profile(Pro);
        break;
    }
}

encoder.Start();
thanks in advance,
George
I faced the same problem, but the problem doesn't reside in your code or mine. When I tried to capture the screen from the Windows Media Encoder application itself, I hit the same problem in about 50% of the sessions. It's evidently a bug in Windows Media Encoder itself.
George
Here are a couple of options (from http://www.windowsmoviemakers.net/Forums/ShowPost.aspx?PostID=1982):
Enable the MouseKeys accessibility option, and type + to double-click.
Run the encoder and the target application on different machines, and capture a remote desktop session.
