
Showing posts from October, 2017

5.6, due Wed Nov 1 17

Difficult: The hardest part of this section was the sheer amount of material. There are so many doubly-defined words, and I'm not sure how I'm going to remember all of them, but I'm working on it. Application: Probability distributions of continuous random variables are really useful in a lot of applications. I've used them in research several times to try to predict the probability of an event occurring. I just have to figure them out mathematically now...
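As a sketch of the kind of continuous-distribution computation I mean (the normal distribution here is my own example, not necessarily the book's), the probability of landing in an interval is just the difference of the CDF at the endpoints:

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """CDF of a normal distribution, written with the error function."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

def prob_between(a, b, mu=0.0, sigma=1.0):
    """P(a < X < b) for X ~ Normal(mu, sigma): integrating the density
    over [a, b] is the same as CDF(b) - CDF(a)."""
    return normal_cdf(b, mu, sigma) - normal_cdf(a, mu, sigma)

# For a standard normal, about 68% of the mass lies within one sigma.
print(round(prob_between(-1, 1), 3))  # 0.683
```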

5.5, due Mon Oct 30

Difficult: I have absolutely no idea why the weird "1_x" is important. If 1_x = X(w), why don't we just use X(w)? (pg 196) Application: I think I just read through the entirety of what I didn't fully understand in my stats class. It's amazing what a foundation of math will do to help you understand stuff. Also, probability distributions are cool.
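One way I've seen the indicator idea motivated (my own sketch, not the book's notation on pg 196): the indicator of an event A is itself a random variable whose expected value is exactly P(A), which makes probabilities computable as averages:

```python
import random

random.seed(0)  # fixed seed so the simulation is repeatable

def indicator(event, outcome):
    """1_A(w): 1 if outcome w lies in event A, else 0."""
    return 1 if event(outcome) else 0

# Event A = "die roll is even"; E[1_A] should equal P(A) = 1/2.
is_even = lambda w: w % 2 == 0
rolls = [random.randint(1, 6) for _ in range(100_000)]
estimate = sum(indicator(is_even, w) for w in rolls) / len(rolls)
print(round(estimate, 2))  # close to 0.5
```

The payoff of writing 1_A instead of a generic X(w) is that expectations of indicators turn back into probabilities, so you can count events by summing random variables.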

5.4, due Fri Oct 27

Difficult: There was a lot of information in this section, and trying to keep it all straight is going to be a bit of a challenge. I don't really get all of the new notation yet. Application: I feel like this section started to move a little bit from probability into statistics. I guess the difference shouldn't be very big, but at the same time it might, I feel, be more intuitive to me. We'll see.

5.3, due Oct 25

Difficult: I don't fully understand Unexample 5.3.4 - do disjoint events always have probability 0? Also, just in general, I'm trying to understand how you would do this. The homework doesn't make sense to me yet. Application: I liked the prosecutor's fallacy. It seems like it would be really easy to mix up P(A|B) and P(B|A) if you aren't careful.
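To see how badly mixing up P(A|B) and P(B|A) can go, here is a quick Bayes computation with numbers I invented for illustration: a forensic match that is very unlikely for an innocent person can still leave a roughly even chance of innocence once you condition the right way around:

```python
# Invented numbers for illustration only.
p_match_given_innocent = 0.001   # P(match | innocent): 1 in 1000
p_innocent = 0.999               # prior: only 1 person in 1000 is guilty
p_match_given_guilty = 1.0       # the guilty person always matches

# Total probability: P(match) over both hypotheses.
p_match = (p_match_given_guilty * (1 - p_innocent)
           + p_match_given_innocent * p_innocent)

# Bayes' rule: P(innocent | match) -- NOT the same as P(match | innocent)!
p_innocent_given_match = p_match_given_innocent * p_innocent / p_match
print(round(p_innocent_given_match, 3))  # 0.5
```

So P(match | innocent) is 0.1%, but P(innocent | match) is about 50% - the fallacy is quoting the first number as if it were the second.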

5.2, due Monday Oct 23

Difficult: I'm a bit confused by how we add probabilities, especially with Bayes' rule. I can see how it's calculated... wait. Sorry. P(E ∩ F) is not the same as P(E)P(F). This suddenly makes a lot more sense. Application: Well, I feel that the examples in the book give some pretty good applications. But even with the examples, understanding this stuff well enough to do the homework is iffy.
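That realization - P(E ∩ F) equals P(E)P(F) only when the events are independent - can be checked by brute force on a small sample space. A sketch with two fair dice (my own example events):

```python
from fractions import Fraction
from itertools import product

# Sample space: two fair dice, all 36 outcomes equally likely.
omega = list(product(range(1, 7), repeat=2))

def prob(event):
    """Exact probability of an event by counting outcomes."""
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

E = lambda w: w[0] == 6          # first die shows 6
F = lambda w: w[0] + w[1] == 7   # dice sum to 7
G = lambda w: w[0] + w[1] >= 10  # dice sum to at least 10

# E and F happen to be independent: P(E ∩ F) = P(E)P(F) = 1/36.
print(prob(lambda w: E(w) and F(w)) == prob(E) * prob(F))  # True
# E and G are not: P(E ∩ G) = 1/12, but P(E)P(G) = 1/36.
print(prob(lambda w: E(w) and G(w)) == prob(E) * prob(G))  # False
```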

Section 5.1, due Fri Oct 20 17

Difficult: This section wasn't very difficult to follow. It seemed mostly to follow the intuition that I already have of probability. Probably the hardest part was realizing that probability 1 is only an almost guarantee, not a guarantee, of an occurrence. Application: Well, probability and statistics, etc., show up everywhere. I'm really excited to cover all of this material quickly, because we'll get to the interesting stuff and I hope to understand it better than before.

Study, due Wed Oct 18 17

For Wednesday October 18, as you study for the exam, write responses to the following questions. There is a study guide available here.

Which topics and ideas do you think are the most important out of those we have studied? I think that the theorems and algorithms are the most important topics.

What kinds of questions do you expect to see on the exam? I expect to see a lot of computational problems - do something with a graph, Huffman encode something, use Dijkstra's algorithm to find the minimum distance in some graph - and also maybe a few proofs, but not many, because the definitions are pretty hand-wavy.

What do you need to work on understanding better before the exam? I need to figure out how to use each of the algorithms, and how to do proofs in this class. I feel like we've done fewer of them than in 344.

Thinking about the answers to these questions can help guide your study.
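For the "Huffman encode something" kind of problem, a rough sketch of the construction (my own toy implementation, not the book's): repeatedly merge the two least-frequent subtrees, then read codes off the paths to the leaves.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build Huffman codes: repeatedly merge the two least-frequent
    subtrees; a code is the path to each leaf (0 = left, 1 = right)."""
    # Heap entries are (frequency, tie-breaker, tree); a tree is either
    # a single character or a (left, right) pair.
    heap = [(f, i, ch) for i, (ch, f) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (t1, t2)))
        count += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + '0')
            walk(tree[1], prefix + '1')
        else:
            codes[tree] = prefix or '0'
    walk(heap[0][2], '')
    return codes

codes = huffman_codes("aaaabbc")
print(codes)  # 'a' is most frequent, so it gets the shortest code
```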

4.4, due Monday Oct 16

Difficult: I had a hard time following what, exactly, NP-hard means. It felt like the definition wasn't as clear as the ones for P, NP, and NP-complete. Application: Approximation is an important thing. I would guess that (as Dr. Jarvis talked about in his BYU speech), one of the biggest takeaways is that sometimes, approximations are acceptable - in industry, life, business, etc. That's good life advice right there.

4.3, due Friday Oct 13 17

Difficult: Once again, the hardest part of the section was understanding the algorithm up until the example/picture was shown. Then it looks super cool. Application: Not directly related to the section, but looking through the study guide, I didn't realize how disjoint all of these topics are. It can make studying difficult, just because there's very little flow from one topic to another.

4.2, due Wed Oct 11

Difficult: The most difficult part was trying to follow the explanation of Dijkstra's algorithm before the example was given; working through the example greatly improved things. I keep trying to remember these concepts that I feel like I've learned, but they are taking a while to come back. Application: Well, besides the XKCD comic application (great inclusion, by the way - that should get published with the book), Dijkstra's algorithm has a lot of uses. Including, I would imagine, some application to the Traveling Salesman problem. I wonder how that would work, though...
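Working the example in code helped it stick for me. A minimal sketch of Dijkstra's algorithm with a binary heap, assuming a dict-of-adjacency-lists graph (the graph and weights below are made up):

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from source.
    graph maps each node to a list of (neighbor, weight) pairs."""
    dist = {source: 0}
    pq = [(0, source)]  # min-heap of (tentative distance, node)
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float('inf')):
            continue  # stale heap entry; a shorter path was found already
        for v, w in graph[u]:
            if d + w < dist.get(v, float('inf')):
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist

graph = {
    'a': [('b', 2), ('c', 5)],
    'b': [('c', 1), ('d', 4)],
    'c': [('d', 1)],
    'd': [],
}
print(dijkstra(graph, 'a'))  # {'a': 0, 'b': 2, 'c': 3, 'd': 4}
```

On the TSP question: Dijkstra finds shortest paths from one source, but TSP asks for a shortest tour visiting every node, which is NP-hard, so it needs different (often approximation) techniques rather than Dijkstra directly.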

4.1, due Oct 9

Difficult: The word 'memoization' threw me off a lot at the beginning. I don't quite follow bottom-up dynamic programming. Why would or could this ever be faster than recursion? Also, the examples only half-help me. For instance, for the Fibonacci numbers, should you store the two you need for the next value each time? Application: Well, clearly, optimization is in everything. I like that we're starting to learn about it. Also, approximation making algorithms better is an interesting concept.
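On the Fibonacci question: yes, storing just the two most recent values is exactly the bottom-up trick. Naive recursion recomputes the same subproblems exponentially many times; bottom-up computes each value once, from the base cases upward. A sketch:

```python
def fib_bottom_up(n):
    """Bottom-up Fibonacci: build from the base cases upward, keeping
    only the two most recent values. O(n) time, O(1) extra space,
    versus the exponential blowup of naive recursion."""
    if n < 2:
        return n
    prev, curr = 0, 1
    for _ in range(n - 1):
        prev, curr = curr, prev + curr
    return curr

print([fib_bottom_up(n) for n in range(10)])  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```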

3.4, due Friday Oct 6

Difficult: I didn't understand what heaps were before this, and understanding how the indexing on heap storage in an array works was a bit difficult. Application: Heaps are super useful. It's interesting that we can get almost the entire usefulness of a BST without the complexity of insertion and such (unless we need to sift the heap). Also, people's names for some data structures make me laugh.
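The array indexing is the part that finally made heaps click for me. With 0-based indexing, the children of node i sit at 2i+1 and 2i+2, and the parent at (i-1)//2, so no pointers are needed. A sketch:

```python
def parent(i):
    """Index of the parent of node i in a 0-based array heap."""
    return (i - 1) // 2

def left(i):
    """Index of the left child of node i."""
    return 2 * i + 1

def right(i):
    """Index of the right child of node i."""
    return 2 * i + 2

# A min-heap stored flat in an array: every node <= both its children.
heap = [1, 3, 2, 7, 5, 4, 8]
assert all(heap[parent(i)] <= heap[i] for i in range(1, len(heap)))
print(left(0), right(0), parent(5))  # 1 2 2
```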

3.3, due Wednesday Oct 4

Difficult: This is material that I have mostly already covered in CS 235, so it wasn't super difficult. Probably the hardest part was remembering how AVL rotations work. Application: AVL trees are super useful. I remember having to implement a BST (and maybe an AVL one, I'm not sure) in CS 235 from scratch. That took some time, but was fun.
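To jog my memory on rotations, here is a minimal sketch of a right rotation on a toy node class (my own simplification - a real AVL tree would also track heights and decide when to rotate):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def rotate_right(y):
    """Right rotation at y: y's left child x becomes the subtree root,
    y becomes x's right child, and x's old right subtree becomes y's
    left subtree. In-order traversal order is preserved."""
    x = y.left
    y.left = x.right
    x.right = y
    return x  # new root of this subtree

def inorder(node):
    return inorder(node.left) + [node.key] + inorder(node.right) if node else []

# Left-heavy chain 3 <- 2 <- 1; one right rotation rebalances it.
root = Node(3, left=Node(2, left=Node(1)))
root = rotate_right(root)
print(root.key, inorder(root))  # 2 [1, 2, 3]
```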