Finding probability for at least - probability

I have a doubt about the following question:
There are 5 components (C1, C2, C3, C4, C5) in a system, and the probability that any one component works is q. I have to find the probability that at least three of the components work.
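Assuming the components work or fail independently (the question does not say so, so this is an assumption), the number of working components follows a binomial distribution, and the answer is the tail sum P(at least 3) = C(5,3) q^3 (1-q)^2 + C(5,4) q^4 (1-q) + C(5,5) q^5. A minimal sketch in Python:

    from math import comb

    def prob_at_least(n, k, q):
        # Probability that at least k of n independent components work,
        # where each works with probability q (a binomial tail sum).
        return sum(comb(n, i) * q**i * (1 - q)**(n - i) for i in range(k, n + 1))

    print(prob_at_least(5, 3, 0.9))  # 0.99144 for q = 0.9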

Related

Algorithm to place N agents into M shelters with minimal cost

TL;DR: Short problem description:
I am looking for an efficient algorithm that optimizes how N agents, located in 2D space, can be placed into M shelters by minimizing the distance the agents need to travel.
Each shelter can only hold 1 agent. If N > M (more agents than available shelters), then some agents will not get placed into shelters (all agents are the same).
(Optional simplification: while agents can be freely located in 2D space, shelters are always arranged on a square grid. No agent is located outside of the convex hull of shelters.)
This is all you need to know. However, if you think that this problem has no efficient solution, then here is a more specific (and, to me, most relevant) version of the problem:
There are exactly 9 shelters, arranged on a square grid (with spacing d). All N agents are located around the central shelter (in a box of size d*d centered on the central shelter). In this case the central shelter is always empty, but each of the other shelters may or may not be available (empty) at the beginning.
For this case, I need an algorithm that solves the problem for arbitrarily many agents N (typically N < 9) and an arbitrary subset of available shelters (anywhere from all 9 down to, in the extreme case, only the central shelter).
The algorithm should be efficient, since I need to solve many of these problems quickly.
Example:
Here is an example with N=3 agents (black dots) and M=5 available shelters (green dots); the red dots show unavailable shelters. I use letters for shelters and numbers for agents.
What I did so far:
I am sure that this problem has a specific name and has been solved/studied already, but I cannot find its name or any solutions. I need to solve many of these problems fast, and I always want the optimal solution (if that's not possible, an almost-optimal solution is also sufficient). Here is what I have tried or thought of so far:
Brute force: I know that the optimal solution can be found by checking all possible assignments, calculating the total travel distance for each, and picking the assignment with the smallest total. However, the number of assignments grows combinatorially, so this becomes expensive when M and N are large.
A fast but very non-optimal heuristic works as follows: for each agent i, calculate the distance to the central shelter E. Starting from the agent with the smallest distance to E, assign it to its closest shelter (in this case E). Then assign the next agent to its closest still-free shelter, and so on, until all agents are assigned or no free shelters remain. This works and is fast, but of course produces non-optimal results (in the example image: 2->E, 1->B, 3->F, while the optimal solution would be 3->E, 2->F, 1->B).
Another idea I'm working on is to first find the agents that are under the most "pressure", i.e. all of their good options are far away. Starting with the agent under highest pressure, assign it to the closest shelter. Continue for all other agents. However, I am not sure how to properly define "pressure" for this problem, as it likely should be a combination of the distances to the first few shelters. Also, I am not sure that this will lead to the optimal solution, but may result in an almost optimal solution.
I am trying to think of this problem as some sort of weighted permutation: I need to select N shelters and map them to the N agents, where each mapping comes at a cost, and I need to minimize the total cost. However, I have no idea how to do this.
Also, I am thinking of some sort of Simulated Annealing, or some form of push-and-pull algorithm where each shelter is attracting agents, or agents are attracted to shelters based on their distance. While this may sound interesting, I would expect that this is computationally not efficient.
I am happy for any input, especially if this problem already has a proper name and solutions. I am also happy for a simple and fast-to-compute algorithm that achieves an almost optimal solution.
As suggested in the comments (thanks again!), this is indeed answered by this post.
Specifically, this is an assignment problem, which is solved by the Hungarian algorithm: consider agents as workers, shelters as tasks, and the cost of worker i doing task j to be the Manhattan distance between agent i and shelter j.
The Python package munkres implements this algorithm and is very fast for the 9-shelter problem. If there are more shelters than agents, the package handles it automatically. For the case of more agents than shelters, I am satisfied with deleting random agents until the number of agents equals the number of shelters. Therefore my problem is solved.
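For reference, here is a minimal sketch of the same formulation using SciPy's linear_sum_assignment instead of munkres (my substitution, not the author's; the coordinates are made-up illustration data):

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    # Made-up 2D positions: N = 3 agents, M = 5 shelters.
    agents = np.array([[1.0, 1.2], [0.4, 0.1], [1.8, 0.9]])
    shelters = np.array([[0, 0], [0, 2], [2, 0], [2, 2], [1, 1]])

    # Cost matrix: Manhattan distance from agent i to shelter j.
    cost = np.abs(agents[:, None, :] - shelters[None, :, :]).sum(axis=2)

    # linear_sum_assignment accepts rectangular matrices, so M > N is fine.
    rows, cols = linear_sum_assignment(cost)
    for i, j in zip(rows, cols):
        print(f"agent {i} -> shelter {j} (distance {cost[i, j]:.2f})")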

How many independent runs ensure that at least one trial succeeds (e.g., for the minimum cut of a graph)?

I just finished the first module of the algorithms specialization course on Coursera.
There was an exam question that I could not quite understand. I have passed that exam, so there is no point in retaking it.
Out of curiosity, I want to learn the principles behind this question.
The question was posted as such:
Suppose that a randomized algorithm succeeds (e.g., correctly computes the minimum cut of a graph) with probability p (with 0 < p < 1). Let ϵ be a small positive number (less than 1). How many independent times do you need to run the algorithm to ensure that, with probability at least 1−ϵ, at least one trial succeeds?
The options given were:
log(1−p)/logϵ
log(p)/logϵ
logϵ/log(p)
logϵ/log(1−p)
I made two attempts and both were wrong. My attempts were:
log(1−p)/logϵ
logϵ/log(1−p)
It's not so much that I want to know the right answer; I want to learn the principles behind this question and what it is asking for, so that I know how to answer similar questions in the future.
I posted this on the course forum, but nobody answered after a month, so I am trying it out here.
No need to post the answer directly. If you get me to an aha moment, I will mark your answer as correct.
Thanks.
How many independent times do you need to run the algorithm to ensure that, with probability at least 1−ϵ, at least one trial succeeds?
Let's rephrase it a bit:
What is the smallest number of independent trials such that the probability of all of them failing is less than or equal to ϵ?
By the definition of independent events, the probability of all of them occurring is the product of their individual probabilities. Since the probability of a single trial failing is (1−p), the probability of all n trials failing is (1−p)^n.
This gives us an inequality for n:
(1-p)^n <= ϵ
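To build intuition without jumping straight to the closed form, here is a small sketch (with illustrative values of p and ϵ, not from the question) that finds the smallest such n by direct search:

    # Smallest number of independent trials n with (1 - p)**n <= eps.
    # p and eps are illustrative values.
    p, eps = 0.3, 0.01

    n = 1
    while (1 - p) ** n > eps:  # each extra trial multiplies the
        n += 1                 # failure probability by (1 - p)
    print(n)  # 13 for these values

Comparing the loop's exit condition with the inequality above is exactly the step that leads to the intended answer.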

Prepare a schedule so that all courses are taught in the least time

I encountered one interview question:
There are some professors, some courses, and some students.
Each professor can teach only a single course.
Each course has a fixed duration (e.g., 10 weeks).
For each professor, you are given a time-availability schedule (assume week-wise).
Each student has a list of courses he wants to learn.
Classes are strictly one-to-one, i.e., one professor teaches one student at a time.
A student can attend only one course at a time.
A professor has to finish teaching a course in one go.
Your aim is to prepare a schedule so that all courses are taught in the least time.
My approach: I said that this could be solved via graph theory, e.g., by making a directed edge from professor to course, or from professor to student. But I was not able to solve it completely.
Is my approach correct, or is it a DP problem?
Any pseudocode or algorithm suggestions?
The problem you were asked is a scheduling problem, a class often attacked with dynamic programming. In particular, your problem is what is usually denoted FJm|brkdwn,pj=10|Cmax, which can be translated as follows:
There are m machines (the professors) that can each process a part of a job (here, a job is the full teaching of one student) independently and in any order. Several machines may be able to process the same part of a job (the same course).
Machines are not continuously available (breakdowns).
The duration of each part of a job (a course) is 10 weeks.
You want to minimize the completion time of all jobs (the makespan, Cmax).
There exist solvers that are well optimized for scheduling problems, but I am not sure whether modeling your problem as a scheduling problem and feeding it to a scheduling solver is what your interviewer intended.
This is similar to the m-coloring problem, except that here we are asked to return the minimum m. Unfortunately, that is an NP-hard problem.
For the given problem, consider each course as a vertex, with an edge between two vertices if they share a common student or are taught by the same professor.
First find an upper bound on m (the minimum number of colors required) using the Welsh–Powell algorithm, then binary-search below that bound for the smallest m for which the graph can be colored with no two adjacent vertices sharing a color; a sketch of the upper-bound step follows.
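A minimal sketch of the greedy-by-degree variant of Welsh–Powell (the course-conflict graph here is made-up illustration data):

    # Welsh-Powell style greedy coloring: visit vertices in order of
    # decreasing degree, giving each the smallest color not used by a
    # neighbour. The color count is an upper bound on the chromatic number.
    def welsh_powell(adj):
        order = sorted(adj, key=lambda v: len(adj[v]), reverse=True)
        color = {}
        for v in order:
            taken = {color[u] for u in adj[v] if u in color}
            color[v] = next(c for c in range(len(adj)) if c not in taken)
        return color

    # Made-up conflict graph: courses sharing a student or professor.
    adj = {
        "A": {"B", "C", "D"},
        "B": {"A", "C"},
        "C": {"A", "B"},
        "D": {"A"},
    }
    print(welsh_powell(adj))  # {'A': 0, 'B': 1, 'C': 2, 'D': 1}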

Is there a name for the algorithmic problem of "booking rooms with the least room switches"?

I was discussing with a co-worker a problem we were having with a piece of software we deploy. He mentioned that it was similar to the conceptual problem of booking rooms over a period of time, where the algorithm should output the room bookings that require the fewest switches (for example, an optimal solution may be staying in one room for 3 days and in another room for the rest, requiring only two switches).
Is there a name for such a problem in algorithms?
Originally I posted something regarding the minimum set cover problem. Although you can describe your problem as a minimum set cover problem, if we assume "room bookings" are over consecutive days, your problem can be more succinctly described with a different problem.
The interval cover problem [1] consists of one big interval (call it (a, b)) and a collection of subintervals (call them (a_i, b_i)). Our goal is to cover the big interval with as few subintervals as possible.
Finding the minimal coverage of an interval with subintervals is a question posted about 5 years ago which asks for an efficient solution, and the accepted answer shows that the greedy solution is optimal. In the context of room bookings, the greedy solution is: starting from the beginning of the period, repeatedly pick, among the bookings that begin at or before the first uncovered day, the one with the latest end date.
The idea, of course, is that each "subinterval" is a booking, so the fewer subintervals we need, the fewer bookings, and hence the fewer "switches"; a sketch follows after the footnote.
[1] I'm not actually 100% sure that this is the correct name, but if you were to say "interval cover problem", the listener would probably think of the same thing.
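A minimal sketch of that greedy cover, with made-up bookings given as (first day, last day + 1) pairs:

    # Greedy interval cover: cover [start, end) with the fewest bookings.
    def min_cover(intervals, start, end):
        intervals = sorted(intervals)            # by begin, then end
        chosen, i, covered = [], 0, start
        while covered < end:
            best = None
            # Among bookings starting at or before the first uncovered
            # day, take the one that reaches furthest.
            while i < len(intervals) and intervals[i][0] <= covered:
                if best is None or intervals[i][1] > best[1]:
                    best = intervals[i]
                i += 1
            if best is None or best[1] <= covered:
                return None                      # gap: the period can't be covered
            chosen.append(best)
            covered = best[1]
        return chosen

    print(min_cover([(0, 3), (2, 9), (3, 7), (8, 10)], 0, 10))
    # [(0, 3), (2, 9), (8, 10)] -> three bookings, two switches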

Optimal selection election algorithm

Given a bunch of sets of people (similar to):
[p1,p2,p3]
[p2,p3]
[p1]
[p1]
Select 1 from each set, trying to minimize the maximum number of times any one person is selected.
For the sets above, the max number of times a given person MUST be selected is 2.
I'm struggling to come up with an algorithm for this. I don't think it can be done with a greedy algorithm; I'm thinking more along the lines of a dynamic programming solution.
Any hints on how to go about this? Or do any of you know any good websites about this stuff that I could have a look at?
This is neither a dynamic programming nor a greedy problem. Let's look at a different problem first: can it be done by selecting every person at most once?
You have P people and S sets. Create a graph with S+P vertices, representing sets and people. There is an edge between person pi and set si iff pi is an element of si. This is a bipartite graph and the decision version of your problem is then equivalent to testing whether the maximum cardinality matching in that graph has size S.
As detailed on that page, this problem can be solved using a maximum flow algorithm (note: if you don't know what I'm talking about, take your time to read up on it now, as you won't understand the rest otherwise). First create a super-source and add an edge from it to each person with capacity 1 (representing that each person may only be used once). Then create a super-sink and add an edge from every set to that sink with capacity 1 (representing that each set may only be used once). Keep the person-to-set edges of the bipartite graph, also with capacity 1, and run a suitable max-flow algorithm between source and sink.
Now, let's consider a slightly different problem: can it be done by selecting every person at most k times?
If you paid attention to the remarks in the last paragraph, you should know the answer: just change the capacity of the edges leaving the super-source to indicate that each person may be used more than once in this case.
Therefore, you now have an algorithm to solve the decision problem in which people are selected at most k times. It's easy to see that if you can do it with k, then you can also do it with any value greater than k, that is, it's a monotonic function. Therefore, you can run a binary search on the decision version of the problem, looking for the smallest k possible that still works.
Note: You could also get rid of the binary search by testing each value of k sequentially, and augmenting the residual network obtained in the last run instead of starting from scratch. However, I decided to explain the binary search version as it's conceptually simpler.
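A minimal sketch of the decision-plus-binary-search approach, using networkx for the max-flow step (networkx is my choice for illustration, not something the answer prescribes; the sets are the example from the question):

    import networkx as nx

    sets = [{"p1", "p2", "p3"}, {"p2", "p3"}, {"p1"}, {"p1"}]
    people = sorted(set().union(*sets))

    def feasible(k):
        # Can we pick one person from every set with nobody picked > k times?
        G = nx.DiGraph()
        for p in people:
            G.add_edge("source", p, capacity=k)          # person used <= k times
        for i, s in enumerate(sets):
            G.add_edge(("set", i), "sink", capacity=1)   # each set filled once
            for p in s:
                G.add_edge(p, ("set", i), capacity=1)
        flow, _ = nx.maximum_flow(G, "source", "sink")
        return flow == len(sets)

    lo, hi = 1, len(sets)
    while lo < hi:                 # feasibility is monotonic in k,
        mid = (lo + hi) // 2       # so binary search finds the smallest k
        if feasible(mid):
            hi = mid
        else:
            lo = mid + 1
    print(lo)  # 2 for the sets above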
