Closed. This question is opinion-based. It is not currently accepting answers.
Closed 2 years ago.
When I have a high-resolution image, which input shape performs better: (512, 512, 3) or (150, 150, 3)?
In general it depends on what you want to achieve and how you define "outperform".
A (512, 512, 3) input will generally score better on task metrics such as accuracy or IoU (depending on the task), while (150, 150, 3) will score worse but give faster runtime and lower memory consumption.
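To make the runtime/memory trade-off concrete, here is a back-of-the-envelope sketch of how much memory just the input batch needs at each resolution. The batch size of 32 and float32 storage are illustrative assumptions, not values from the question:

```python
# Rough memory comparison for one input batch (illustrative only;
# batch size and dtype are assumptions, not from the question).
def input_bytes(height, width, channels, batch_size=32, bytes_per_value=4):
    """Bytes needed to hold one float32 input batch of the given shape."""
    return height * width * channels * batch_size * bytes_per_value

large = input_bytes(512, 512, 3)   # 96 MiB per batch of 32
small = input_bytes(150, 150, 3)   # ~8.2 MiB per batch of 32

print(f"(512, 512, 3): {large / 2**20:.1f} MiB per batch")
print(f"(150, 150, 3): {small / 2**20:.1f} MiB per batch")
print(f"ratio: {large / small:.1f}x")
```

The ratio is (512/150)² ≈ 11.6x, and the same factor applies to every convolutional activation map, which is why the larger input is so much slower and hungrier.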
Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 8 days ago.
Code refactoring can improve the overall quality and efficiency of software development. It should be integrated as an ongoing process in the software development life cycle.
I was expecting a real answer to this.
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 4 years ago.
I am struggling to render a point cloud in WebGL. Once the file size gets larger than 200 MB, the browser crashes. I am currently using Points in Three.js to render my data.
Is there any way to optimise this, or a good library I can use for this case?
This is a very broad topic where many different optimizations are possible.
Take a look at http://www.potree.org/
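One optimization that applies regardless of the rendering library is simply reducing the point count before upload, e.g. by voxel-grid downsampling (keep one centroid per grid cell). A minimal, rendering-library-agnostic sketch (the function name is my own, not a Three.js or Potree API):

```python
from collections import defaultdict

def voxel_downsample(points, voxel_size):
    """Keep one representative point (the centroid) per voxel cell.

    points: iterable of (x, y, z) tuples; voxel_size: cell edge length.
    """
    buckets = defaultdict(list)
    for x, y, z in points:
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        buckets[key].append((x, y, z))
    out = []
    for cell in buckets.values():
        n = len(cell)
        out.append((sum(p[0] for p in cell) / n,
                    sum(p[1] for p in cell) / n,
                    sum(p[2] for p in cell) / n))
    return out
```

Potree goes further with an octree-based level-of-detail scheme, loading only the points visible at the current zoom level, which is what makes multi-gigabyte clouds viewable in a browser.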
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 5 years ago.
What is the fastest classification algorithm?
If I have a very large dataset with a very large number of features and need to classify it within about an hour, what would be the fastest classification algorithm?
It would depend on the nature of your data, its size, and its dimensionality.
Moreover, it would depend on the trade-off you want to achieve between speed and accuracy.
There is no single best algorithm for all cases.
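That said, methods that make a single linear pass over the data, such as nearest-centroid or naive Bayes, are usually among the fastest baselines. As an illustration only (a pure-Python sketch, not a recommendation for any particular dataset), nearest-centroid training is one O(n·d) pass:

```python
def fit_centroids(X, y):
    """Compute one mean vector per class: a single O(n * d) pass."""
    sums, counts = {}, {}
    for xi, yi in zip(X, y):
        if yi not in sums:
            sums[yi] = [0.0] * len(xi)
            counts[yi] = 0
        counts[yi] += 1
        for j, v in enumerate(xi):
            sums[yi][j] += v
    return {c: [s / counts[c] for s in sums[c]] for c in sums}

def predict(centroids, x):
    """Assign x to the class with the nearest centroid (squared distance)."""
    return min(centroids,
               key=lambda c: sum((a - b) ** 2 for a, b in zip(centroids[c], x)))
```

Linear models trained with stochastic gradient descent scale similarly well; tree ensembles and kernel SVMs are typically much slower at this scale but often more accurate, which is exactly the speed/accuracy trade-off mentioned above.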
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 9 years ago.
What kind of data structure should be used for nearest-neighbor searching in two dimensions?
I have searched and found that there are many data structures for this: k-d tree, quadtree, octree.
So which structure should I use?
I suggest an R-tree; it's designed for that purpose.
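A k-d tree, one of the structures the question lists, is also a common choice for 2-D point queries and is simple enough to sketch. The following is a minimal illustration (not a production implementation; it rebuilds-by-sorting at each level and has no balancing on updates):

```python
import math

def build_kdtree(points, depth=0):
    """Build a 2-d tree, alternating the split axis (x, then y) per level."""
    if not points:
        return None
    axis = depth % 2
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return {"point": points[mid],
            "left": build_kdtree(points[:mid], depth + 1),
            "right": build_kdtree(points[mid + 1:], depth + 1)}

def nearest(node, target, depth=0, best=None):
    """Recursive nearest-neighbour query with branch pruning."""
    if node is None:
        return best
    point = node["point"]
    if best is None or math.dist(point, target) < math.dist(best, target):
        best = point
    axis = depth % 2
    diff = target[axis] - point[axis]
    near, far = ((node["left"], node["right"]) if diff < 0
                 else (node["right"], node["left"]))
    best = nearest(near, target, depth + 1, best)
    # Only descend the far side if the splitting plane is closer than
    # the best distance found so far.
    if abs(diff) < math.dist(best, target):
        best = nearest(far, target, depth + 1, best)
    return best
```

Roughly: k-d trees and quadtrees suit point data, R-trees suit rectangles and spatial extents (hence their use in spatial databases), and octrees are the 3-D analogue of quadtrees, so they would not be the first pick for a purely 2-D problem.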
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 8 years ago.
Can anybody give me some useful ideas for estimating the size of a Hadoop cluster and its hardware?
Well, it all depends on the amount of data you are handling and what exactly you are catering to. But you can refer to this pdf here to get more insight into how to estimate.
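The usual starting point is storage arithmetic. A rough sketch of that estimate follows; the replication factor of 3 is the HDFS default, but the 25% temp-space overhead and the 24 TB per-node disk capacity are illustrative assumptions you would replace with your own numbers:

```python
import math

# Back-of-the-envelope Hadoop cluster sizing (storage-driven only;
# CPU- or memory-bound workloads need a separate estimate).
def estimate_nodes(raw_data_tb, replication=3, overhead=0.25,
                   disk_per_node_tb=24):
    """Estimate data-node count: raw data x replication x (1 + temp overhead),
    divided by usable disk per node, rounded up."""
    needed_tb = raw_data_tb * replication * (1 + overhead)
    return math.ceil(needed_tb / disk_per_node_tb)

print(estimate_nodes(100))  # 100 TB raw -> 375 TB needed -> 16 nodes
```

Beyond storage, the sizing guides also factor in expected data growth rate, job concurrency, and whether the workload is I/O-bound or CPU-bound.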