I'm using the open-source Cyberlink UPnP framework in my app.
How can I link this framework into my app, so that users who install my application won't also need to install the framework?
I tried adding the source files and linking them, but I'm having problems with that. I read the programming guide for the framework: http://www.cybergarage.org/pdfdoc/clinkobjcproguide.pdf
What's the best way to do this?
On Flutter's official web page, Flutter is described as follows:
"Flutter is Google’s UI toolkit for building beautiful, natively compiled applications for mobile, web, and desktop from a single codebase."
My questions are:
1. Is it just for cross-platform UIs?
2. If it is, then how do you integrate it with data models and all the hardware features?
Thanks in advance!
Flutter is not just a framework you can build great UIs with. It uses the Dart programming language, and the code is compiled to native platform code. It's not just for UI development: Google uses Flutter to build some of its own applications, such as Stadia, and other companies like The New York Times ship fully functional apps built entirely with the Flutter framework.
So, back to your question: it's not just for building beautiful UIs. Even the documentation says:
natively compiled applications
So you can use Flutter to make cross-platform, native applications, not just UIs. It supports popular state-management approaches such as Redux and the BLoC pattern for your app's reactivity.
I recommend you take a look here to see some of the apps fully built with just the Flutter framework.
Flutter is UI plus business logic, which means frontend, so Flutter is a frontend SDK. Many people say that Dart is used for the backend, but that's not true: Dart is used purely for frontend logic, and Flutter is indeed 100% frontend. But native Android and native iOS development are also purely frontend.
That's because the server-side logic of any app is written with a backend language/framework, which is not the responsibility of a frontend developer (Android, iOS, Flutter or anything else). Which backend technology to use always depends on the company's preferences. And remember, writing your backend in one of these frontend languages is not good for the long-term lifespan of your app. Most companies are not stupid: they don't use Kotlin, Dart or Swift on the backend to save money, because they know that doing so will ultimately cost them more than hiring backend developers separately, since it is very rare (almost impossible) to find experts who can write backends in Dart/Kotlin/Swift.
JavaScript, by contrast, has many existing, widely adopted backend stacks, so the same doesn't hold for web developers; they'll be paid more for being full-stack.
I am trying to make an app in Xamarin.Forms that needs to be able to detect text in images, and I decided to use Firebase ML Kit. How do I use ML Kit with Xamarin.Forms, not just Xamarin.Android? If I can't, is there an alternative I can use with Xamarin.iOS?
I can't see any Firebase ML Kit package for Xamarin.Forms. There are only platform-specific packages: Xamarin.Firebase.ML.Vision for Xamarin.Android and Xamarin.Firebase.iOS.MLKit for Xamarin.iOS.
I think you should use an alternative such as Microsoft Cognitive Services' Computer Vision or a Tesseract package. I had a chance to implement both, and the Azure service's recognition is much better than Tesseract's. On the other hand, Tesseract has an advantage: it works offline and is faster.
There are two ways to use Microsoft Cognitive Services: through their client packages or through the REST service; the results are similar. Tesseract works offline, so you would use its package.
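Here is a minimal sketch of the REST route from shared Xamarin.Forms code, assuming an Azure Computer Vision resource; the endpoint host, the v3.2 OCR API version and the key are placeholders you would replace with your own resource's values.

using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

public static class OcrClient
{
    // Placeholder values for your own Computer Vision resource.
    const string Endpoint = "https://YOUR-RESOURCE.cognitiveservices.azure.com";
    const string SubscriptionKey = "YOUR-KEY";

    // Posts raw image bytes to the OCR endpoint and returns the JSON result
    // (regions -> lines -> words) as a string for you to parse.
    public static async Task<string> RecognizeTextAsync(Stream image)
    {
        using (var client = new HttpClient())
        using (var content = new StreamContent(image))
        {
            client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", SubscriptionKey);
            content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");

            var response = await client.PostAsync(
                Endpoint + "/vision/v3.2/ocr?language=unk&detectOrientation=true", content);
            response.EnsureSuccessStatusCode();
            return await response.Content.ReadAsStringAsync();
        }
    }
}

Since this is plain HttpClient code, it lives in the shared project and works the same on Xamarin.Android and Xamarin.iOS.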
There don't seem to be any APIs or examples of how to integrate WebRTC in Xamarin. There is a third-party API (IceLink) by a company named Frozen Mountain Software, but it requires a paid license.
Any clue as to how to do this?
You will have to use the native WebRTC libraries with Xamarin. If you are fine with using precompiled libraries, you may find them on the web. However, I prefer to compile WebRTC natively for each target platform; Google has documented the steps very well, but it usually takes time.
Once you have the libraries ready for your platform, you can use them from Xamarin, typically through a binding project (see the sketch below). Let me know if you face any issues.
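For example, on Android you could wrap the compiled libwebrtc .aar in a Xamarin.Android binding project and call it from C#. The sketch below assumes such a binding exists; the Org.Webrtc namespace and the exact generated member names depend on how the binding generator maps the Java API, so treat them as illustrative rather than exact.

using Android.Content;
using Org.Webrtc; // namespace produced by the (hypothetical) binding project

public static class WebRtcBootstrap
{
    // Initializes the native library once and returns the factory used to
    // create peer connections, audio/video sources and tracks.
    public static PeerConnectionFactory CreateFactory(Context context)
    {
        PeerConnectionFactory.Initialize(
            PeerConnectionFactory.InitializationOptions
                .InvokeBuilder(context) // maps Java builder(); the generated name may differ
                .CreateInitializationOptions());

        return PeerConnectionFactory.InvokeBuilder().CreatePeerConnectionFactory();
    }
}

You would need a separate binding (or a wrapper around the native library) for the iOS build.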
I'm trying to build a hybrid app with some native features like geotagging, notifications and offline storage. So far in my research I've gotten the feeling that I will need Xcode to access the native features on iOS.
Is there a way I can skip that step? I have found that the Cordova API supports native features from JavaScript, but I have also found some contradicting statements which suggest that I still need to use Xcode.
Thank you
If you want to build an iOS app, you'll need Xcode's command-line tools to build, run and deploy it, so you can't skip this step.
The Steroids tooling lets you develop your app without the need for Xcode (or Android Studio). You use a companion app from the App Store to develop locally, then an online build service to create a stand-alone package. The wrapper provides access to many native APIs, including all Cordova core plugins.
(Disclaimer: I work for AppGyver.)
I would like to create a framework (like Core Data, Core Audio, etc.) which can be used in multiple applications.
Can anyone post links or a tutorial for this?
Try Apple's Framework Programming Guide.
Note that sometimes shared code doesn't make sense in a framework. A lot of framework documentation assumes that you're going to install the framework on the computer rather than embed it in the application.