Sunday 30 November 2014

Sunday, November 30th, 2014

It's really been quite a busy few weeks, and I have a lot to discuss based on the past classes and topics, so this SLOG will be quite a bit longer than some of my other ones. I'll start with the ever-feared big-oh, big-omega, and especially big-theta proofs. I missed a class, so there is a bit of a gap in my big-oh knowledge, which makes it difficult for me to even begin describing my issues, but I'll do my best. For starters, big-oh's general formula sticks in my head: there exists a c in the positive real numbers and there exists a breakpoint B in the natural numbers such that for all natural numbers n, if n is greater than or equal to the breakpoint B, then f(n) is less than or equal to c times g(n). I still don't fully understand what the breakpoint equates to; is it the point past which the function's worst case has to stay under the bound? It's hard for me to grasp that concept entirely, and it's something I'll be looking into in detail before the exam. Big-theta is also extremely confusing. I take big-theta to be a conglomeration of big-oh and big-omega, but how can f(n) be bounded both above and below by constant multiples of g(n)? In tutorial, there needed to exist two separate c's, one for big-oh and one for big-omega, each somehow falling in line to support the claim big-theta makes. It's very frustrating, but it's another thing I need to study for the test. Big-omega itself is the mirror image of big-oh, with g(n) acting as a lower bound instead of an upper bound, so making sense of one should help me make sense of the other. To assist myself, I will be visiting the help center sometime this week for further instruction.
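    To help myself pin down what c and the breakpoint B actually do, I wrote a small Python check of my own (the function name and the numbers are mine, not the course's). It just verifies that f(n) stays at or below c times g(n) for every n from B up to some limit; it obviously isn't a proof, since it only checks finitely many values, but it shows the roles the two constants play.

    def looks_like_big_oh(f, g, c, B, limit=1000):
        """Return True if f(n) <= c * g(n) for every n with B <= n <= limit."""
        return all(f(n) <= c * g(n) for n in range(B, limit + 1))

    # Example: f(n) = 2n + 10 looks like it is in O(n), witnessed by c = 3 and B = 10,
    # because 2n + 10 <= 3n exactly when n >= 10.
    print(looks_like_big_oh(lambda n: 2 * n + 10, lambda n: n, c=3, B=10))  # True
    # With a breakpoint that is too small the check fails: at n = 5, 20 > 15.
    print(looks_like_big_oh(lambda n: 2 * n + 10, lambda n: n, c=3, B=5))   # False

    Seeing it this way, the breakpoint is just the point past which the inequality has to hold; below B, f(n) is allowed to be bigger than c times g(n).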
    Moving on from their definitions, the structure of the proof itself is simple: choose a c and a breakpoint B, then assume the antecedent and chain inequalities until you arrive at f(n) being at most c times g(n). Of course, the structure IS always the simplest part of a proof, and has not as much to do with the actual "proving" part. Point being made: I need lots of practice with big-oh in general, in all shapes and forms, as it is a type of concept I have not dealt with at this depth in any of my other courses. We skimmed over it in CSC148, but here it's broken down into full algorithmic detail, and I need to practice it for the test. The polynomial big-oh questions are that much more challenging, as there are more terms to deal with, which confuses me slightly, but I have faith in myself to be able to tackle these problems in my studies.
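    To make that structure concrete for myself, here is one polynomial example I worked through on my own (my own numbers, not one taken from lecture), showing that 3n^2 + 2n is in O(n^2):

    \begin{align*}
    &\text{Claim: } \exists c \in \mathbb{R}^+,\ \exists B \in \mathbb{N},\ \forall n \in \mathbb{N},\ n \ge B \Rightarrow 3n^2 + 2n \le c \cdot n^2. \\
    &\text{Pick } c = 5 \text{ and } B = 1, \text{ and assume } n \ge B. \text{ Since } n \ge 1, \text{ we have } n \le n^2, \text{ so} \\
    &\qquad 3n^2 + 2n \le 3n^2 + 2n^2 = 5n^2 = c \cdot n^2, \\
    &\text{which is exactly the consequent, so the claim holds.}
    \end{align*}

    The "assume the antecedent" part is where n being at least B gets used; everything else is bookkeeping around that one chain of inequalities.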
    Going towards more computational problems, halting really confuses me. It reminds me a lot of recursion from my CSC148 class, calling a function within itself to produce a type of looping effect. In proofs about halting, I can't trace the code well enough in my head to solve the problems easily. Normally, not to be overconfident, I'm fairly good at tracing through a mental call stack while coding, but halting is something completely different: passing a function as a parameter to another function to test whether the function will loop forever or halt. I do have the lecture notes, practice questions, and slides to reference for this problem, and I have no trouble finding other examples of halting problems when I google it, so all these resources help me a lot with this issue.
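    To get the flavour of the argument without having to trace it all in my head, here is a rough Python sketch of the classic contradiction (the names halt and confused are my own; this is not the exact code from lecture). It pretends someone hands us a halt(f, i) function that answers whether f(i) eventually stops, then builds a function that deliberately does the opposite of whatever halt predicts.

    def halt(f, i):
        """Pretend oracle: True iff f(i) eventually halts. No correct version can exist."""
        ...

    def confused(f):
        """Do the opposite of whatever halt predicts about f(f)."""
        if halt(f, f):
            while True:   # halt says f(f) halts, so loop forever
                pass
        else:
            return 42     # halt says f(f) loops forever, so stop right away

    # Now consider confused(confused).
    # If halt(confused, confused) returns True, then confused(confused) loops forever.
    # If it returns False, then confused(confused) returns 42 and halts.
    # Either way halt is wrong about confused(confused), so no such halt can exist.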
    Finally, I'll talk about induction, but only briefly, as the topic was almost immediately lost on me on the first day. I enjoy math and calculation when there is a clear-cut answer to be found and taken from an equation, but induction argues that if a claim holds for a base case, and holding for one number forces it to hold for the next, then it holds for the whole set. I don't understand this at all, and quite frankly, I'm not too sure how to tackle this problem. I could always google it as I would with halting, which I will do later, but induction only appears in full force later in CSC236, so I'm a bit lazier when it comes to searching up proper induction (as any good computer scientist would be!). Making base cases is of course the easier step in induction, but once Prof. Heap switched it around to do the inductive step first, I was even more confused than I originally was. I'll need to see him in office hours to make sure I understand the concepts I've listed in this post, but I am certain that with practice, I'll have no trouble passing the course and moving on to larger and more difficult problems that I will once again be able to solve.
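    Since I clearly need to see at least one induction written out in full, here is a standard textbook example I tried on my own (not one Prof. Heap did in class), proving that 1 + 2 + ... + n = n(n+1)/2 for every natural number n greater than or equal to 1:

    \begin{align*}
    &\text{Base case } (n = 1):\quad 1 = \tfrac{1 \cdot 2}{2}. \\
    &\text{Inductive step: assume } 1 + 2 + \cdots + k = \tfrac{k(k+1)}{2} \text{ for some } k \ge 1. \text{ Then} \\
    &\qquad 1 + 2 + \cdots + k + (k + 1) = \tfrac{k(k+1)}{2} + (k + 1) = \tfrac{(k+1)(k+2)}{2}, \\
    &\text{which is the statement for } k + 1, \text{ so by induction the claim holds for all } n \ge 1.
    \end{align*}

    The base case anchors the claim and the inductive step is what lets it climb from each number to the next; writing it out this way at least shows me what the two pieces are supposed to accomplish.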

Sunday 2 November 2014

Sunday November 2nd, 2014

    This week of class became much more difficult than anything else we have visited in the course so far. The concept itself is understandable: finding the worst case and bounding its number of steps with an overestimate (an upper bound) and an underestimate (a lower bound); that part is clear. What isn't clear to me is how we'd be able to recreate Professor Heap's thinking strategies and calculations on a test or an assignment. The concept is quite difficult, but I feel that with more attention and practice I would be able to understand it further. I'll study the slides and talk to my friends about the big-oh methods. I could also relate this big-oh concept to things I have previously learned, such as analyzing lists and using the Python visualizer to track through a program's steps. For example, I could run a type of "simulation" in the visualizer using insertion sort on a list taken from a claim on a previous test, tracing the code for a few steps to analyze how the steps run, where they execute, and in which instances certain lines wouldn't run. On that topic of lines that wouldn't run, what really bothered me about this week was understanding how exactly certain lines end up running a certain number of times in terms of n. I'll continue with my "simulation" example to work on that as well. For now, I'll remain attentive and keep discussing with my classmates how big-oh works, and I might even approach Prof. Heap after class to speak with him about the material.
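    To actually run the kind of "simulation" I'm describing, here is a version of insertion sort I instrumented myself (my own sketch, not the course's code) with a counter on the inner loop body. It makes it easier to see why that line runs roughly n squared times in the worst case and not at all in the best case.

    def insertion_sort_steps(L):
        """Sort L in place and count how many times the inner loop body runs."""
        steps = 0
        for i in range(1, len(L)):
            j = i
            while j > 0 and L[j - 1] > L[j]:
                L[j - 1], L[j] = L[j], L[j - 1]  # shift the out-of-place element left
                j -= 1
                steps += 1                       # one execution of the inner body
        return steps

    # Already-sorted input: the inner body never runs.
    print(insertion_sort_steps(list(range(10))))         # 0
    # Reverse-sorted input: it runs 1 + 2 + ... + 9 = 45 times,
    # which is n(n-1)/2 for n = 10 -- roughly n^2, the worst case.
    print(insertion_sort_steps(list(range(10, 0, -1))))  # 45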

Saturday 25 October 2014

Saturday, October 25th, 2014

    This week, we covered more proofs; however, this week was the "end" of proofs for the rest of the section. I honestly do see us using proofs later regardless of us being "finished" covering them. We covered disproving and the allowed inferences in more detail through the middle of the week. I found the list of inferences extremely helpful; it will end up being used not only by me but by most students in the class as a study aid. A lot of the inferences, although common sense, will still help many students, including me, when writing proofs on tests, speeding up our thinking processes. As for disproving, I found it quite easy to grasp: take the statement, negate it, then prove the negation. I feel I may be wrong, though, as looking back at my notes, the result doesn't look like a straightforward negation of the original; the example we took in class is what I'm referring to (for all x in a set X, for all y in a set Y, P(x,y) implies Q(x,y)). I've written out what I think the negation should be at the bottom of this post, and I'll end up talking to Prof. Heap about it next week. The rest of the week consisted of looking at sorting algorithms that I had learned about in CSC108 the previous year. Insertion sort and selection sort were two of the algorithms referenced in class; I'd always had difficulty distinguishing between the two, but this year I feel I have a better grasp on them. As for Friday, we did a sorting exercise, which I found interesting, cutting the number of pennies in half from drawer to drawer to home in on a certain number. It was insightful as an exercise, but I feel I wouldn't be able to do something like that in a test environment. Overall, I felt more comfortable with the material this week, and I will continue to use the strategies I have listed in previous posts to stay comfortable.
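    Here is the negation of the class example written out the way I currently understand it (not yet checked with Prof. Heap). I think the reason it doesn't look like a "full negation" is that the implication itself changes shape:

    \[
    \neg\big(\forall x \in X,\ \forall y \in Y,\ P(x,y) \Rightarrow Q(x,y)\big)
    \;\equiv\;
    \exists x \in X,\ \exists y \in Y,\ P(x,y) \wedge \neg Q(x,y)
    \]

    So disproving the universal claim comes down to exhibiting one pair x, y where P holds and Q fails, in other words a counter-example.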

Monday 20 October 2014

Sunday October 19th, 2014

    To start off, obviously today's date isn't the 19th, but this is an update on the previous week. I had put off writing this post for a bit since I didn't feel I had learned anything new, or understood any more than I did the week before Thanksgiving. To recap this past week, we continued running through proofs and learned a different way to structure them, which I found quite useful for trimming away a good amount of time while writing proofs; that will come in handy come test/exam days. Using the smaller starting bits of a proof to reach the conclusion will really help me in those situations. Also, throughout the week, we looked at several different ways to write proofs, one being proof by cases. This one stood out to me simply because splitting the argument into separate cases felt like unnecessary work, but I digress. I mean, this type of proof will be useful, but at the moment it seems like a bit of extra effort. I am still having difficulties with proofs, but I know I'll be able to understand them with time and practice.
    Take today's tutorial, for example, where I had a breakthrough (or at least I hope I did) in writing a full proof. The question, although simple, was still conceptually difficult for me, so I tried my hardest to understand it, following the necessary steps: introducing each "for all" variable as a generic element of the set it belongs to and setting up the necessary bookends. But then it came to choosing a specific number to stand in for one of the variables, and I froze for a while. I looked at the question more carefully, read it out in plain English, and the lightbulb went off almost immediately for what I needed to do. I felt pretty good about it, but I'm still not 100% confident that I got the answer right, even though I feel my proof holds up well. For now, I'll continue practicing and taking whatever results I get as a learning experience.

Saturday 11 October 2014

Saturday, October 11th, 2014

Good Afternoon,

    This week was surprisingly clearer than last week when it came to proofs, although I am still quite cloudy on how to properly structure and define them. Last week I was almost completely lost on the topic of proofs, both their structure and their definitions. I didn't entirely understand the concepts related to existential and universal statements, but after this week, I realize that for a universal statement, you need to assume that the number n is a generic element of whichever set the universal statement quantifies over. As for existential statements, I learned that when a statement claims some n exists in a set, we need to prove that such an n exists, so we pick a specific number (a witness) and name it n. That is what I took from lectures and tutorial, but I am not completely solid on it.
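    To test whether I actually have this, here are two tiny examples I made up myself (not ones from lecture), one of each kind:

    \begin{align*}
    &\text{Universal: to show } \forall n \in \mathbb{N},\ n + 1 > n, \text{ let } n \text{ be a generic natural number; then } n + 1 > n \text{ holds,} \\
    &\quad \text{and since nothing special was assumed about } n, \text{ it holds for every natural number.} \\
    &\text{Existential: to show } \exists n \in \mathbb{N},\ n > 5, \text{ pick the specific witness } n = 6; \text{ since } 6 > 5, \text{ the statement holds.}
    \end{align*}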
    Speaking of the tutorial this week, I feel my quiz went better than in previous weeks simply because I spoke up and paid a little closer attention to the TA's examples and explanations. This quiz was breezier to run through than the previous two; on the first two quizzes I felt I didn't take the necessary steps to further my learning by checking over the examples and really reading into the material. Noting this, I'll make that a standard practice for the rest of this course. There's nothing more I can go into detail about, as I have yet to comprehend proofs well enough to write about them at length, so for now, I'll be ending this SLOG post until next week.

Damon Kissoon

Saturday 4 October 2014

Saturday, October 4th, 2014

Good Afternoon,

    I know this post is later than it should have been and I'm one week short of updates, but I was sick last week. I was diagnosed with (insert disease that I'd rather not share with the public here), which left me unable to attend school or even easily get out of bed for the week of September 21st. Since then, I've gotten a lot better: I handed Prof. Heap a doctor's note, (nameless disease) is slowly working its way out of my system thanks to modern medicine and good bed rest, and I was able to go to school this past week, albeit a little behind on certain techniques like De Morgan's Law and other things I may have missed. With that said, I can discuss the things I may have missed, as well as the things I have understood over this week.
    Let me start by talking about some things I may have left out of the previous SLOG, such as conjunction, disjunction and negation. These are three topics I have encountered in previous courses such as CSC108, where, in Python, I had tweaked around with things like programming using boolean operators. Seen through logic, however, they change slightly. The "and" operator still holds the same meaning: in Python, the whole expression is not True unless both statements are True, and logically this turns out to be the same in CSC165. "Or" is the one I had to think about more: in Python, the entire statement is true if either statement is true, and in logic, "or" is an inclusive "and/or" kind of statement, true when either or both of the two statements (the disjuncts) are true, without actually combining them into one. It's a slightly different way of thinking about it, and I'm not entirely sure I have the right handle on it, but this is what I pick up from disjunction. Negation was much easier on me, however: conjunction swaps with disjunction, existential swaps with universal, and "P implies Q" becomes "P and not Q". I found negation relatively easy simply because all you truly need to know is that the symbols flip, "exists" becomes "for all" (and vice versa), and the only way to negate an implication is to write it as P and not Q. To double-check myself, I ran the rules through a quick truth-table test in Python, which I've put right below.
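    This is just a brute-force check of my own (not from the course), running every combination of True and False through Python to make sure the negation rules behave the way I described.

    from itertools import product

    def implies(p, q):
        """Material implication: p => q is False only when p is True and q is False."""
        return (not p) or q

    for p, q in product([True, False], repeat=2):
        # De Morgan: negating an "and" swaps it for an "or".
        assert (not (p and q)) == ((not p) or (not q))
        # Negating an implication gives "p and not q"...
        assert (not implies(p, q)) == (p and (not q))
        # ...and NOT "p implies not q", which disagrees whenever p is False.
        if not p:
            assert implies(p, not q) != (not implies(p, q))

    print("all four combinations of truth values check out")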
    As for this past week, despite missing out on several things the week before, I did pick up on some key concepts. While talking about constructing proofs, I learned that we can swap the order of adjacent quantified variables, as long as both are under the same kind of quantifier, universal or existential. For example: "for all m in the set of natural numbers and for all n in the set of natural numbers, the product of m and n is a natural number." The positions of m and n can be switched around since the quantifiers match. The same goes for existential quantifiers: two adjacent "there exists" statements can also be flipped. I've written the class example out symbolically below.
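    Written out in symbols (my own rendering of the class example, so the notation may not be exactly how Prof. Heap wrote it):

    \[
    \forall m \in \mathbb{N},\ \forall n \in \mathbb{N},\ m \cdot n \in \mathbb{N}
    \;\equiv\;
    \forall n \in \mathbb{N},\ \forall m \in \mathbb{N},\ m \cdot n \in \mathbb{N}
    \]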
    As for concepts I still remain fuzzy on, I'm not entirely sure how to fully construct proofs, but I'm hoping Prof. Heap will go into more detail about them next week. What I do understand is how a proof is meant to be structured: a chain of statements, each following from the ones before it, that together establish the original implication. I am still unclear on how those intermediate statements are come up with in the first place, but I think by starting from the first implication I would be able to get a proof going. Otherwise, it remains unclear. So, for this SLOG, I have parted with all of my ideas, and I hope they will serve as a useful reflection on my current knowledge and help me learn further.

Friday 19 September 2014

Friday, September 19th, 2014

Hello,

    I'm Damon Kissoon and I'm beginning my SLOG for the second week of class in CSC165! Let me begin by saying that this is my second year going into computer science, as my first year got off to an extremely rocky start. However, I got back into computer science over the summer, and I'm working to earn a spot in a subject POSt for third year! This year, I am taking CSC148 alongside CSC165, as well as MAT137 over the Fall/Winter term, and some second-year courses in software design in the Winter. That being said, I have a lot on my plate.
    My other courses aside, I have been having problems here and there in CSC165 so far. I don't have an incredibly strong math background, so that is part of my beginner's frustration, but this week I found that I was catching on to the material a little bit more quickly with the help of some of my peers in class.
    For instance, I didn't fully understand how an empty set in the antecedent would make the entire statement true. It all came down to my realization that, simply, the only way to make an implication false is for the antecedent to be true and the consequent to be false. Take, for example, "I choose not to speak, therefore I will stay silent." If I were to make this statement false, I would still choose not to speak, but I would end up not staying silent, making me not only a liar but leaving the statement entirely untrue. Using the example from class, where the antecedent was "employees who make over 80 trillion dollars a year" and the consequent was that those employees "are male", the entire statement is true. I figured out that the only way it could be false was if there were a counter-example to the statement, and looking at the chart, no employee comes anywhere near 80 trillion dollars, so there are no counter-examples and the statement comes out true. That held even when the statement jumped to the more farfetched conclusion of "all employees who make over 80 trillion dollars a year have mauve eyeballs and breathe ammonia": it is still true because there is no employee who makes over 80 trillion dollars a year to serve as a counter-example. I found that quite clear in today's lecture, and I was extremely glad I did, as it helped me understand implications a little more deeply, and how a statement with a false antecedent is true regardless of whether the consequent is true or false. I've tried the same idea out in Python below.
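    This is only a toy version with made-up names and salaries (not the actual chart from class), but Python's all() over an empty selection shows the same vacuous truth.

    # Toy employee records: (name, salary, eye_colour)
    employees = [("Alice", 60000, "brown"),
                 ("Bob", 75000, "blue"),
                 ("Carol", 82000, "green")]

    # "All employees who make over 80 trillion dollars a year have mauve eyeballs."
    # Nobody earns anywhere near that much, so the generator produces no candidates
    # at all, and all() of an empty sequence is True -- the claim is vacuously true.
    claim = all(eyes == "mauve"
                for (name, salary, eyes) in employees
                if salary > 80_000_000_000_000)
    print(claim)  # True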
    Ultimately this week, I am still a bit shaky when it comes to drawing Venn diagrams and illustrating sets containing other sets, as I don't fully understand when a region can be empty and when it must contain at least one element. I feel, however, that this skill will come with practice and a little more time, and that if I ask more questions in tutorial, I will get my answers. In general, although my programming background and math background are each a little shaky, I will work my hardest to succeed in fully understanding the way a logician's mind works.