Posts

Showing posts from August, 2017

Around Here Thirty-Four 8/19-8/25

A glimpse into what it is like to live in our home just this week. Intentional Outdoor Hours: 434+ hours (of 1000). Up fourteen hours this week: I got in a few walks with the dogs, we ran through the wind in the yard right before a big rainstorm hit last weekend (the kids were cracking up as the leaves blew off the trees around them), and we watched the solar eclipse on Monday. We had only about 79% coverage up here in western PA, but it was still pretty cool and gave off a weird, eerie light at its peak. On Friday night, the youth football players were recognized on the field, and then the kids had a blast playing with all their friends in the grass area during the game. Brandon and I kept saying we were leaving "any minute" for two hours (!) and we didn't actually end up heading home until the game was totally over and we were some of the last cars left in the parking lot - just too much fun catching up and chatting with friends and family! Reading not so m

Linear classifiers

In this tutorial, we study the behavior of 5 linear classifiers on artificial data. Linear models are often the baseline approaches in supervised learning. Indeed, because they rely on a simple linear combination of the predictive variables, they have the advantage of simplicity: reading off the influence of each descriptor is relatively easy (signs and values of the coefficients), and the learning algorithms are often (though not always) fast, even on very large databases. We are interested in: (1) the naive Bayes classifier; (2) linear discriminant analysis; (3) logistic regression; (4) the perceptron (single-layer perceptron); (5) the support vector machine (linear SVM). The experiment was conducted under R. The source code accompanies this document. My idea, besides the theme of the linear classifiers that concerns us here, is also to describe the different stages of the elaboration of an experiment for the comparison of learning techniques. In addition, we also show the results provided by the linear appr
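The scheme described above can be sketched in a few lines. This is a minimal Python/scikit-learn version (the tutorial itself works under R): the five linear classifiers are fitted on artificial two-class data and compared by their test error rates. The dataset sizes and generator settings are illustrative, not those of the tutorial.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression, Perceptron
from sklearn.svm import LinearSVC

# Artificial data: 20 predictive variables, 2 classes (illustrative sizes)
X, y = make_classification(n_samples=2000, n_features=20,
                           n_informative=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0)

# The five linear classifiers under study
models = {
    "naive Bayes": GaussianNB(),
    "linear discriminant analysis": LinearDiscriminantAnalysis(),
    "logistic regression": LogisticRegression(max_iter=1000),
    "perceptron": Perceptron(max_iter=1000),
    "linear SVM": LinearSVC(max_iter=10000),
}

error_rates = {}
for name, clf in models.items():
    clf.fit(X_train, y_train)
    # test error rate = proportion of misclassified test instances
    error_rates[name] = 1.0 - clf.score(X_test, y_test)

for name, err in sorted(error_rates.items(), key=lambda kv: kv[1]):
    print(f"{name}: {err:.3f}")
```

On such linearly generated data the five methods usually land within a few points of one another, which is the kind of comparison the experiment is built to exhibit.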

Around Here: Thirty-Three 08/11-08/18

A glimpse into what it is like to live in our home just this minute. photo cred: Steph Oakes (thank you for the pic text!) Intentional Outdoor Hours: 420+ hours (of 1000). Ugh. Only up 13 hours from last week - we are feeling pretty pulled at the seams with this end-of-summer last-minute mania, new job start-up, curriculum planning, and day-care-searching spiral of death we are in right now (HAHAH). I did get a few walks in with the pups, and some of our fun activities and visits lent themselves to outdoor time. Reading, ugh, again. Same issues as the spiral-of-death comment above. I have been trying to squeeze in a chapter or two of Emma by Jane Austen for my local book club when I can (the chapters are nice and short, so I can get them in while I wait for a pot to boil or while I drink my first few sips of coffee). But it's not been too fruitful of a reading week. Playing with cousins at our annual Uzelac reunion. We spent Saturday with our extended family catching

Discriminant analysis and linear regression

Linear discriminant analysis and linear regression are both supervised learning techniques. But the first is related to classification problems, i.e. the target attribute is categorical; the second is used for regression problems, i.e. the target attribute is continuous (numeric). However, there are strong connections between these approaches when we deal with a binary target attribute. Using a practical example, we describe the connections between the two approaches in this case. We detail the formulas for obtaining the coefficients of discriminant analysis from those of linear regression. We perform the calculations under Tanagra and R. Keywords: linear discriminant analysis, predictive discriminant analysis, multiple linear regression, Wilks' lambda, Mahalanobis distance, score function, linear classifier, SAS, PROC DISCRIM, PROC STEPDISC. Components: LINEAR DISCRIMINANT ANALYSIS, MULTIPLE LINEAR REGRESSION. Tutorial: en_Tanagra_LDA_and_Regression.pdf. Programs and dataset
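The connection described above can be checked numerically. The following sketch (in Python/scikit-learn rather than Tanagra or R, and on simulated rather than the tutorial's data) illustrates the classical result: with a binary target, the coefficient vector of a multiple linear regression on the 0/1-coded class labels is proportional to the discriminant direction of linear discriminant analysis, so the two coefficient vectors point in the same direction.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LinearRegression

# Simulated binary-target data (illustrative, not the tutorial's dataset)
X, y = make_classification(n_samples=500, n_features=5,
                           n_informative=5, n_redundant=0, random_state=1)

# Multiple linear regression on the 0/1-coded class labels
ols = LinearRegression().fit(X, y)

# Linear discriminant analysis on the same data
lda = LinearDiscriminantAnalysis(solver="lsqr").fit(X, y)

a = ols.coef_
b = lda.coef_.ravel()

# The two coefficient vectors are proportional: cosine similarity ~ 1
cosine = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
print(f"cosine similarity: {cosine:.6f}")
```

Only the direction of the score function matters for classification, which is why the proportionality constant between the two coefficient vectors is harmless; the tutorial's formulas make that constant explicit.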

a chat with Grey and Gem

I had a chat with Grey and Gem this weekend. I started with "remember a long time ago when white people thought black people should be slaves" and followed with a chat about the Civil War and Martin Luther King Jr., and how that was all so crazy and such a long time ago, and Grey said, "Mum! We know about this already, why are we talking about it, it makes me feel mad?!" and I said, we're talking about it because something bad happened in Virginia and it's about people who think that having white skin is the best skin. They asked questions, like "you mean this wasn't a long time ago, but now?" and I told them about the statue being removed and the Confederate flags and how the marchers were upset that they were being removed even though what they stood for was a disgraceful and hurtful part of our history. They asked lots of "but why?" to which I offered that maybe they didn't read enough, or travel enough, or have enough friends th

Gradient boosting with R and Python

This tutorial follows the course material devoted to "Gradient Boosting", to which we refer constantly in this document. It also comes in addition to the supports and tutorials for the Bagging, Random Forest and Boosting approaches (see References). The thread will be basic: after importing the data, which are split into two data files (learning and testing) in advance, we build predictive models and evaluate them. The test error rate criterion is used to compare the performance of the various classifiers. The question of the parameters, particularly sensitive in the context of gradient boosting, is studied. Indeed, there are many parameters, and their influence on the behavior of the classifier is considerable. Unfortunately, even if we can guess which paths to explore to improve the quality of the models (more or less regularization), accurately identifying the parameters to modify and setting the right values is difficult, especially because they (the various parameters) can interact with eac
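The basic thread above can be sketched as follows. This is a minimal Python version with scikit-learn standing in for the packages used in the tutorial; the pre-split learning/testing files are simulated here, and the parameter values are illustrative, chosen only to show how learning rate and number of trees interact.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Simulated stand-in for the two pre-split data files
X, y = make_classification(n_samples=3000, n_features=20, random_state=0)
X_learn, y_learn = X[:1500], y[:1500]   # learning sample
X_test, y_test = X[1500:], y[1500:]     # testing sample

# Parameters interact: a smaller learning rate (more regularization)
# usually needs more trees to reach the same quality of model.
errors = []
for lr, n_trees in [(1.0, 50), (0.1, 50), (0.1, 500)]:
    gbm = GradientBoostingClassifier(learning_rate=lr,
                                     n_estimators=n_trees,
                                     random_state=0)
    gbm.fit(X_learn, y_learn)
    # test error rate on the held-out testing sample
    err = 1.0 - gbm.score(X_test, y_test)
    errors.append(err)
    print(f"learning_rate={lr}, n_estimators={n_trees}: "
          f"test error = {err:.3f}")
```

Running grids of this kind over several parameters at once is exactly where the difficulty described above appears: each setting shifts the others' optimal values.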

Around Here Week Thirty-Two: 08/04-08/10

A glimpse into what it is like to live in our home just this minute. Intentional Hours Outdoors: 407+ hours (of 1000). I totally geeked out this week on my dog walks over how beautiful the corn field looked in the golden-hour light... you may have seen my insta stories totally dedicated to Mother Earth, hah. We had a perfect day at the 'beach' at Quemahoning with the Stiffler crew, and Grey naturally wanted to fish the entire time while the rest of us waded in the water and the kids went to town on a mountain of a new sand pile! B and the kids had a couple trips into the woods to check trail cams while I logged some yard laps jogging and listening to the Showtime Spanish podcast (so fun!). Reading Sisterland by Curtis Sittenfeld and starting Emma by Jane Austen for my local book club group! Getting in a ton of cousin time with playdates and special outings. The smallest two Garretsons and Caleb spent Saturday morning at our house. It felt so much like autumn outside that G