If random variables X and Y are mutually independent, then their joint entropy H(X,Y) is simply the sum H(X)+H(Y) of their component entropies. If they are not mutually independent, then their joint entropy is H(X)+H(Y)-I(X;Y), where I(X;Y) is the mutual information of X and Y.
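A minimal sketch of this identity, using a made-up joint distribution (the array p_xy below is purely illustrative) to compute H(X,Y), the marginal entropies, and I(X;Y) = H(X)+H(Y)-H(X,Y):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zero entries ignored)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint distribution of X and Y (rows: X, columns: Y); X and Y
# are deliberately not independent here.
p_xy = np.array([[0.25, 0.25],
                 [0.40, 0.10]])

p_x = p_xy.sum(axis=1)            # marginal distribution of X
p_y = p_xy.sum(axis=0)            # marginal distribution of Y

H_xy = entropy(p_xy.ravel())      # joint entropy H(X,Y)
H_x, H_y = entropy(p_x), entropy(p_y)
I_xy = H_x + H_y - H_xy           # mutual information I(X;Y)

# For a dependent pair, I(X;Y) > 0 and H(X,Y) = H(X) + H(Y) - I(X;Y);
# for an independent pair, I(X;Y) = 0 and the joint entropy is just the sum.
print(H_xy, H_x + H_y - I_xy, I_xy)
```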
And I have another reason which I will express to my St. Paul friends, and that is, that they have always been controlled in their political action here by a class of men who have always some traps set for the country members—by men who are always up to this border ruffian kind of trickery—this kind of skullduggery, as it is called in Minnesota. And now St. Paul allows herself to be controlled by these same border ruffian politicians.
The wag from the south of France was in immense force, and incessantly ejaculated ‘Vodki! Vodki!’ capering about with a glass of that liquor in his hand, and drinking and hobnobbing with everybody. I tried a glass of vodki,* and immediately understood what genuine blue ruin was. For this Vodki was bright blue, and it tasted—ugh! of what did it not taste? […]
* Or Vodka, both terminations seem to be used indifferently.
The faire Ophelia! Nymph, in thy Orizons / Be all my ſinnes remembred.