"Understanding the Linux® Virtual Memory Manager" new edition? - linux-kernel

Is there a new, updated edition of this classic book planned?
I'd love to see it updated to 3.2 or later. It would make a classic reference alongside Linux Device Drivers, 4th edition, which is also going to be based on 3.2.
It's a pain to map the old data structures to the newer ones while reading it.

Related

What is the "TBD Release Iron" and what are the modifications?

Some Win32 API function documentation (for example this and this) contains the following note:
Starting with TBD Release Iron, the behavior of this and other NUMA
functions has been modified to better support systems with nodes
containing more than 64 processors. For more information about this
change, including information about enabling the old behavior of this
API, see NUMA Support.
What exactly is the "TBD Release Iron"?
Which Windows versions does it support?
What modifications does the note refer to?
Elsewhere, for example on https://learn.microsoft.com/en-us/windows/win32/procthread/numa-support, the same note calls it
Windows 10 Build 20348
So it looks like the folks at MSDN / MS Docs / Learn have some mass replacing to do.
As for the actual change, there are now (as is tradition) Ex variants of the NUMA functions that add support for processor groups, allowing applications to specify affinity on machines with more than 64 logical processors, if I interpret it correctly.
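If that interpretation is right, the group-aware variants are the ones that work with a GROUP_AFFINITY (a processor group number plus a 64-bit mask within that group) instead of a single machine-wide mask. A minimal plain-C sketch of walking the nodes with GetNumaNodeProcessorMaskEx (error handling reduced to early exits):

// Sketch: enumerate NUMA nodes with the group-aware "Ex" call.
// GetNumaNodeProcessorMaskEx reports the node's processor group as well as
// its affinity mask, which is how it can describe machines with more than
// 64 logical processors.
#include <windows.h>
#include <stdio.h>

int main(void)
{
    ULONG highestNode = 0;
    if (!GetNumaHighestNodeNumber(&highestNode))
        return 1;

    for (USHORT node = 0; node <= (USHORT)highestNode; node++) {
        GROUP_AFFINITY affinity;
        if (GetNumaNodeProcessorMaskEx(node, &affinity))
            printf("node %u -> group %u, mask 0x%llx\n",
                   (unsigned)node, (unsigned)affinity.Group,
                   (unsigned long long)affinity.Mask);
    }
    return 0;
}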
"To Be Determined (TBD) release" means Microsoft has not decided exactly when this feature is going to be shipped. Iron is a branch codename. According to Betawiki, Iron was 21H1 and included Windows Server 2022.
Why Microsoft expects people to keep up with their internal codenames and if the changes have shipped and they just forgot to update the page to use the publuc name or if the feature/change has not shipped yet, I don't know.
The changes as compared with Windows 7 are listed under "Behavior starting with TBD Release Iron" on each page...

Migrate to Xamarin.IOS Unified API

Recently I got a project which was built with Xamarin on Mac. Now, when I try to open this project in Visual Studio for Mac (as you all know, Xamarin is now Visual Studio for Mac), it shows some errors regarding MonoTouch.
The question is: do I really need to convert the app to the Unified API? I know there is a tutorial in the official Xamarin docs on changing an app to the Unified API, but is there any other way to open the app without migrating to the Unified API? And what would be the advantages and disadvantages of migration?
There are quite a few reasons why you might consider updating, but I will highlight some of the more important ones. Consider that Apple, as a manufacturer of hardware and software, has always striven to keep its devices up to date, so lagging behind as an iOS app developer can absolutely affect the demand for your app.
Firstly, it has already become a push-or-jump situation, as Xamarin has stopped updating or adding features to the 'Classic API' (as of writing we are on iOS 10.3):
The complete removal of classic support is scheduled for next fall
with the release of Xamarin.iOS 10.0.
Secondly, the Unified API is needed to meet Apple's requirement for 64-bit support:
The new Unified APIs are required to support 64 bit device
architectures from a Xamarin.iOS mobile application. As of February
1st, 2015 Apple requires that all new app submissions to the iTunes
App Store support 64 bit architectures.
As to your concern regarding the disadvantages, I will simply say that the migration can go either smoothly or not so smoothly. It's worth bearing in mind that the Unified API uses different native data types, which may require some workarounds depending on the current structure of your original code.
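For context on why those native data types exist at all: they track Objective-C types whose width changes with the architecture (the Unified API's nint/nuint map to NSInteger/NSUInteger). A tiny Objective-C illustration, not Xamarin code, just the underlying behaviour:

// The platform types mirrored by the Unified API change size with the
// architecture, which is why 32-bit assumptions (plain int everywhere)
// in Classic API code can need reworking.
#import <Foundation/Foundation.h>

int main(void)
{
    @autoreleasepool {
        // 4 bytes in a 32-bit build, 8 bytes in a 64-bit build.
        NSLog(@"sizeof(NSInteger)  = %lu", (unsigned long)sizeof(NSInteger));
        NSLog(@"sizeof(NSUInteger) = %lu", (unsigned long)sizeof(NSUInteger));
    }
    return 0;
}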
The biggest point is what I mentioned earlier: in Apple's App Store, if you linger behind in keeping your app up to date with the latest SDK, API, or anything else Apple decides to upgrade, it is akin to giving up on that application.
I've put together some links below that may aid you in the migration process:
Native Types - Describes the new native data types that you will need to use in a Unified API app.
32/64 bit Platform Considerations - Considerations in choosing 32-bit and 64-bit modes for your application.
Updating Existing iOS Apps - Follow these steps to update an existing Xamarin.iOS app to use the Unified API.
Binding Objective-C Libraries - This document describes the process used to create C# bindings of Objective-C APIs and how the idioms in Objective-C are mapped to the idioms used in .NET. If you are binding just C APIs, you should use the standard .NET mechanism for this, the P/Invoke framework.
Binding Definition Reference Guide - This is the reference guide that describes all of the attributes available to binding authors to drive the binding generation process.
Updating UI Components - This is a guide to the process of updating UI components to the latest versions within the Unified API.

Several versions of Delphi

Because different customers are on different application versions, I need to maintain several versions of Delphi on my laptop (7, XE7, XE8, 10.1 Berlin and 10.2 Tokyo). My main concern is the PATH variable and problems at compile and link time. Will there be any problems? Do I need to change anything? Any suggestion is most welcome.
I have all versions of Delphi from 7 thru XE8 installed in a single VM and versions 1 thru 6 in another (my Delphi "museum" :)).
The Delphi "museum" is a Windows XP VM to avoid the problems that those older versions of Delphi have with more recent Windows versions. The Delphi 7+ VM started life as Windows 7 VM but has since been upgraded to Windows 8.x and then Windows 10 without any problem.
The two sets of VMs are kept separate in this way to avoid OS complications with those older versions, and because I use 1-6 only very, very rarely and version 7+ more often. The precise version at which the "cut-off" was made was determined by the fact that dotted unit names are only supported from version 7 onward, so a lot of the code I have written for 7+ is simply not usable with 1-6 and there's no point having them alongside each other.
In both cases the IDE/compilers (any version) have no intrinsic problems running alongside other versions.
The only real difficulty is installing Delphi 2006 on Windows Vista (or later). Should you ever need to, this is the only one that presents any real difficulty due to a dependency on .NET which is not handled very well by the installer. But it is do-able and not especially difficult as long as you follow the steps described in detail by Dr. Bob.
Install Locations: Minimising PATH Length/Manageability
With a large number of Delphi versions installed, the overall length of the PATH variable can become a problem, but in my experience this is a problem only of manageability. To simplify things on that score, and to avoid problems with earlier versions of Delphi on more recent versions of Windows, I installed all my IDEs in a sub-folder directly off the root:
c:\delphi\<version>
Where version is each Delphi version number (e.g. 7.0, 2007, 2009, XE, XE2, etc.). I then have a number of other folders for shared components:
c:\delphi\bde
c:\delphi\database desktop
c:\delphi\shared files
When I set up the VM, I installed each Delphi version in order and changed the installation locations for these components to these locations. In this way there is one common installation of these shared components, which is updated by each more recent version as required.
I also have a c:\delphi\common\ folder where I keep things such as pre-compiled FastMM_FullDebugMode.dll etc to be shared across all Delphi versions.
I did all this primarily for my own benefit however, to keep things organised and consistent rather than to solve any particular problem (apart from the previously mentioned issues affecting older versions if installed under Program Files).
For example, if you simply install into the default locations, you will end up with versions "scattered" across Borland, CodeGear and Embarcadero folders. All my IDE versions are in one place.
With or without these considerations, the IDE should be perfectly happy to run all the different versions you mention without any particular configuration required, but you may need to pay attention to configuration/assumptions made by some 3rd party packages/libraries.
3rd Party Packages
Most 3rd party libraries/packages are usually fine, but there may be the occasional one that needs a bit of help. I myself have never come across anything that couldn't be resolved but have to say that I also don't use 3rd party libraries particularly extensively so simply may not have come across any "trouble makers".
In any event, it's unfortunately difficult to give general advice on this point since it obviously depends very much on the 3rd party libraries and the particular "problems" that any particular one might have.
I have all Delphi versions from 6 to 10.2 installed on a computer running 64-bit Windows 8.1. It's not easy to set up, especially for the older versions. The first rule would be: do not install to "c:\program files"; use a separate directory (I use "c:\delphi" with a numerical subdirectory for each version).
That has two effects:
Older versions, which still write to the installation directory, will work.
The path entries will not be as long (even so, they will become too long after the 5th or 6th Delphi install; see the comments to your question for possible solutions).
Why did I not use multiple VMs? I maintain GExperts for the versions mentioned above and it is too much hassle to maintain the VMs. As long as it works, I will keep all Delphi versions on my computer. If it stops working, I will probably drop GExperts support for some Delphi versions.
There are multiple articles on getting older Delphi versions to work on Windows 8.1. They might be useful if you try it.
They are all in the category Windows 8.1:
https://blog.dummzeuch.de/category/windows/windows-8-1/

Statistics on Xcode users adoption rate per version

I hope this is the right forum to ask this question. If not, please redirect me to the best place to ask it.
I'm looking for usage statistics of Xcode by version.
More specifically, which versions of Xcode new apps, or updates, are compiled with.
Googling this doesn't turn up anything relevant.
This is of interest to me since I am supplying a Swift compiled framework which requires a specific Swift runtime per compilation version.
Xcode 7.3.x was updated to use the Swift 2.2 compiler rather than the 2.1 compiler used in Xcode 7.2.x.
I would like to fully migrate to 2.2, but only when there is a high enough adoption rate of Xcode 7.3.

Does a newly produced mac application need to support 10.4, and can I both support 10.4 and prepare for 64bit?

My company is in the process of rewriting our software from scratch, and I'm the one who is going to be doing most of the work in rewriting the Mac client (the core of our software is Windows-based, and the Mac client communicates with it through a web service).
This isn't a real heavyweight app; it mainly does some background tracking work and has a UI component for the user to enter information.
I'm trying to decide how hard I should argue for dropping support for 10.4 and going with pure 10.5+/Obj-C 2.0 code.
My main motivations for this are:
It would be easier to code; I could use all the features of Objective-C 2.0, such as synthesized properties and fast enumeration.
It would give me access to several classes, and methods in existing classes, that don't exist in 10.4 (just in mocking up a UI I've come across NSPathControl and NSTreeNode, both of which I would otherwise be very happy to use).
Preparing for the conversion to 64-bit coming in Snow Leopard. It seems like most of the techniques for preparing for the move to 64-bit (NSInteger, etc.) are only available in 10.5+, and it would not be possible to use these if writing for 10.4 (see the short sketch after this list).
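To make those features concrete, here is a minimal sketch (a hypothetical Widget class, nothing from our actual product) showing a synthesized property, fast enumeration, and NSInteger, all of which require 10.5:

#import <Foundation/Foundation.h>

// Hypothetical model object: the property accessors are generated by the
// compiler via @synthesize (Objective-C 2.0, 10.5+).
@interface Widget : NSObject {
    NSString *name;
}
@property (copy) NSString *name;
@end

@implementation Widget
@synthesize name;

- (void)dealloc
{
    [name release];
    [super dealloc];
}
@end

// NSInteger matches the pointer width (new in 10.5); fast enumeration is
// also a 10.5 addition.
static NSInteger CountNamed(NSArray *widgets, NSString *target)
{
    NSInteger count = 0;
    for (Widget *w in widgets) {
        if ([w.name isEqualToString:target])
            count++;
    }
    return count;
}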
The downside would of course be that we'd no longer be supporting an operating system that was only a year out of date.
My boss is himself supportive of this move, but of course has our customers to consider and doesn't want to cause any more issues for them than are justified. The director of support would like to support 10.4. I suspect the other execs will be marginally against it at first, just due to the not being able to support some customers thing. Everybody would be open to persuasion by a good argument from either side.
I'm trying to talk to some of the support people and get an idea of how many of our customers are actually still using 10.4, but I don't have that data yet.
Some kind of hybrid solution might be possible, such as rewriting parts of the old client to use the new webservice, or writing the client in 10.5 and backporting it to 10.4 if enough people made a fuss, but quite frankly those sound like they're likely to be even more trouble than giving up the 10.5 features and writing the code in 10.4 to begin with.
So I guess my questions are as follows:
Given the information above, do you think making a case for the adoption of 10.5+ only is the right thing to do? Do you have any suggestions as to how this might be presented positively to the rest of the company?
I don't know as much about the coming 64 bit transition as I'd like. Does anybody have any good references on what will be different, and do you think that supporting only 10.5+ would make this transition easier for us?
If I were doing the update, I would target 10.5, especially since 10.6 is just around the corner and 10.5 came out with a lot of great new things (especially Objective-C 2.0). However, I think you really need to answer this question based on what you think your target customer group will be using. If they are slow to adopt new technology, it may be that you have to support 10.4 or risk losing a portion of your customer base.
On the other hand, you can actually target 10.4 and build against the 10.5 SDK. That way you can take advantage of all the preparations for 64-bit added to the SDK. You just have to ensure that you don't use any classes or features of the frameworks that didn't exist in 10.4. You can also weak-link the 10.5 frameworks and programmatically decide whether you can use a new feature or not (while this is a bit of extra work up front, you can easily phase 10.4 support out of your code in the future and take full advantage of 10.5 improvements for users who actually are running 10.5).
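A minimal sketch of that runtime check, assuming you build against the 10.5 SDK with the deployment target set to 10.4 (MakePathViewForURL is just an illustrative helper, not part of any real API):

#import <Cocoa/Cocoa.h>

// Look the 10.5-only class up by name instead of referencing it directly,
// so the binary still loads on 10.4; fall back to an older control there.
static NSView *MakePathViewForURL(NSURL *url)
{
    Class pathControlClass = NSClassFromString(@"NSPathControl");
    if (pathControlClass != nil) {
        // Running on 10.5 or later.
        id control = [[[pathControlClass alloc] initWithFrame:NSZeroRect] autorelease];
        [control setURL:url];
        return control;
    }
    // Running on 10.4: show the path in a plain, read-only text field.
    NSTextField *field = [[[NSTextField alloc] initWithFrame:NSZeroRect] autorelease];
    [field setEditable:NO];
    [field setStringValue:[url path]];
    return field;
}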
There are a lot of blogs and write-ups about this kind of cross-version targeting out on the web. The other thing to keep in mind is that if you do target 10.4, make sure you have a 10.4 machine available to do a lot of testing (especially if you compile with the 10.5 SDK to take advantage of the 64-bit-ready features). Also check the docs for any feature you may want to use from the 10.5 SDK; many features were actually available in 10.4 but undocumented, and the new documentation usually states which features you can safely use when deploying to 10.4.
Do you need 64-bit? Unless your application is very CPU-intensive, it won't make any difference.
Tiger can run 64-bit applications, but without a GUI. If you need 64-bit, you can create a 64-bit CLI executable that does the heavy lifting and provide a 32-bit front-end for it (using NSTask and NSPipe).
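Roughly, that split could look like the sketch below ("crunch-helper" is a made-up name for the bundled 64-bit tool; error handling is omitted):

#import <Foundation/Foundation.h>

// 32-bit front-end side: launch the bundled 64-bit helper and collect its output.
static NSString *RunHelper(NSArray *arguments)
{
    NSString *helperPath = [[NSBundle mainBundle] pathForResource:@"crunch-helper" ofType:nil];
    NSTask *task = [[[NSTask alloc] init] autorelease];
    NSPipe *pipe = [NSPipe pipe];

    [task setLaunchPath:helperPath];
    [task setArguments:arguments];
    [task setStandardOutput:pipe];   // the helper's stdout comes back through the pipe

    [task launch];
    NSData *output = [[pipe fileHandleForReading] readDataToEndOfFile];
    [task waitUntilExit];

    return [[[NSString alloc] initWithData:output encoding:NSUTF8StringEncoding] autorelease];
}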
You can also have separate .nib files for Leopard and Tiger:
- (id)init
{
    // AppKit's version number tells us whether we are on Tiger (10.4) or Leopard (10.5).
    BOOL tiger = floor(NSAppKitVersionNumber) <= NSAppKitVersionNumber10_4;
    NSString *nibName = tiger ? @"WindowTiger" : @"WindowLeopard";
    if ((self = [super initWithWindowNibName:nibName])) {
        // …
    }
    return self;
}
You really need to find out what your customers are using, and the support person is probably best positioned to know, or the product manager. That said there's nothing wrong with making the technical arguments clear now even if 90%+ of your user base were pre-Leopard; that way the issues will be known (and hopefully understood) so you'll have more support as the environment does change.
I never wrote production code in Objective-C and it's hard to keep up, but as far as I am aware NSInteger and friends are in 10.4; it's just that Cocoa isn't 64-bit in 10.4, whereas in 10.5 most of it is (so there's no more need for a separate 64-bit worker process under a 32-bit UI).
I don't know what your product is, or who your customers are, but in my experience Mac users are early adopters (relatively speaking). I've never used an OS X version for longer than two weeks before the next upgrade was out, and in my circle I am a late adopter. Of course I'm not just a business Mac user, and that may well make a big difference.
What makes 64-bit a requirement in your code? There's not much of a reason not to compile a universal binary holding as many architectures as you wish; you could have one binary run natively on G4, G5, and 32-bit and 64-bit Intel with no problem. If you're just doing 64-bit because you can, there's no reason (that I can imagine) not to keep supporting 32-bit, but if you want things like CoreAnimation you don't have much choice.
I don't think it's wrong to demand 10.5 for new development, but it wouldn't make much business sense to force a whole new OS on customers just to keep using your existing product. So if you can, stay compatible, maybe backport your new features/patches for a time. There is a good reason for forking in version control and this might be it.
Edit:
Since I posted this I learned that I was wrong and NSInteger did not exist before 10.5. I think I assumed too much having used similar types (like NSDecimal) earlier.
