In [[probability theory]] and [[information theory]], '''adjusted mutual information''' (AMI), a variation of [[mutual information]], may be used for comparing clusterings.<ref name="vinh-icml09">{{cite doi|10.1145/1553374.1553511}}</ref> It corrects for the agreement between two clusterings that arises solely by chance, in the same way that the [[adjusted Rand index]] corrects the [[Rand index]]. It is closely related to the [[variation of information]]:<ref>{{cite doi|10.1016/j.jmva.2006.11.013}}</ref> when a similar adjustment is made to the VI index, it becomes equivalent to the AMI.<ref name="vinh-icml09" /> The adjusted measure, however, is no longer a true metric.<ref name="vinh-jmlr10">{{Citation
 | title = Information Theoretic Measures for Clusterings Comparison: Variants, Properties, Normalization and Correction for Chance
 | journal = The Journal of Machine Learning Research
 | volume = 11 | issue = oct | year = 2010 | pages = 2837–2854
 | last1 = Vinh | first1 = Nguyen Xuan | last2 = Epps | first2 = Julien | last3 = Bailey | first3 = James
 | url = http://jmlr.csail.mit.edu/papers/volume11/vinh10a/vinh10a.pdf}}</ref>
==Mutual information of two partitions==
Given a set ''S'' of ''N'' elements <math>S=\{s_1, s_2,\ldots, s_N\}</math>, consider two [[partition of a set|partitions]] of ''S'', namely <math>U=\{U_1, U_2,\ldots, U_R\}</math> with ''R'' clusters, and <math>V=\{V_1, V_2,\ldots, V_C\}</math> with ''C'' clusters. It is presumed here that the partitions are so-called ''hard clusterings'': the clusters within each partition are pairwise disjoint,
:<math>U_i\cap U_j = V_i\cap V_j = \varnothing</math>
for all <math>i\ne j</math>, and complete:
:<math>\bigcup_{i=1}^R U_i=\bigcup_{j=1}^C V_j=S</math>
The [[mutual information]] of the cluster overlap between ''U'' and ''V'' can be summarized in the form of an ''R''×''C'' [[contingency table]] <math>M=[n_{ij}]^{i=1 \ldots R}_{j=1 \ldots C}</math>, where <math>n_{ij}</math> denotes the number of objects common to clusters <math>U_i</math> and <math>V_j</math>. That is,
:<math>n_{ij}=\left|U_i\cap V_j\right|</math>
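For illustration only (this example is not drawn from the cited references), the contingency table can be built directly from two hard clusterings; the following Python sketch assumes the clusterings are encoded as equal-length integer label arrays:
<syntaxhighlight lang="python">
import numpy as np

def contingency_table(u_labels, v_labels):
    """Build the R x C contingency table n_ij = |U_i ∩ V_j| from two
    hard clusterings given as equal-length integer label arrays."""
    u_labels = np.asarray(u_labels)
    v_labels = np.asarray(v_labels)
    u_clusters = np.unique(u_labels)   # the R clusters of U
    v_clusters = np.unique(v_labels)   # the C clusters of V
    table = np.zeros((len(u_clusters), len(v_clusters)), dtype=int)
    for i, u in enumerate(u_clusters):
        for j, v in enumerate(v_clusters):
            table[i, j] = np.count_nonzero((u_labels == u) & (v_labels == v))
    return table

# Two hard clusterings of N = 6 objects
U = [0, 0, 0, 1, 1, 2]
V = [0, 0, 1, 1, 2, 2]
print(contingency_table(U, V))   # [[2 1 0], [0 1 1], [0 0 1]]
</syntaxhighlight>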
Suppose an object is picked at random from ''S''; the probability that the object falls into cluster <math>U_i</math> is:
:<math>P(i)=\frac{|U_i|}{N}</math>
The [[entropy]] associated with the partitioning ''U'' is:
:<math>H(U)=-\sum_{i=1}^R P(i)\log P(i)</math>
''H''(''U'') is non-negative and takes the value 0 only when there is no uncertainty in determining an object's cluster membership, ''i.e.'', when there is only one cluster. Similarly, the entropy of the clustering ''V'' can be calculated as:
:<math>H(V)=-\sum_{j=1}^C P'(j)\log P'(j) </math>
where <math>P'(j)=|V_j|/N</math>. The [[mutual information]] (MI) between the two partitions is:{{citation needed|date=September 2011}}
:<math>MI(U,V)=\sum_{i=1}^R \sum_{j=1}^C P(i,j)\log \frac{P(i,j)}{P(i)P'(j)}</math>
where ''P''(''i'',''j'') denotes the probability that a point belongs to both cluster <math>U_i</math> in ''U'' and cluster <math>V_j</math> in ''V'':
:<math>P(i,j)=\frac{|U_i \cap V_j|}{N}</math>
MI is a non-negative quantity upper bounded by the entropies ''H''(''U'') and ''H''(''V''). It quantifies the information shared by the two clusterings and thus can be employed as a clustering similarity measure.
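As a further illustrative sketch (not an implementation from the references, and assuming natural logarithms), the entropies and the mutual information follow directly from the row sums, column sums, and cells of the contingency table constructed above:
<syntaxhighlight lang="python">
import numpy as np

def entropy_from_counts(counts):
    """H = -sum p log p for a partition given its cluster sizes."""
    p = np.asarray(counts, dtype=float)
    p = p / p.sum()
    p = p[p > 0]                       # 0 * log 0 is treated as 0
    return -np.sum(p * np.log(p))

def mutual_information(table):
    """MI(U, V) computed cell by cell from the R x C contingency table."""
    table = np.asarray(table, dtype=float)
    N = table.sum()
    a = table.sum(axis=1)              # row sums    a_i = |U_i|
    b = table.sum(axis=0)              # column sums b_j = |V_j|
    mi = 0.0
    for i in range(table.shape[0]):
        for j in range(table.shape[1]):
            n_ij = table[i, j]
            if n_ij > 0:               # P(i,j) * log(P(i,j) / (P(i) P'(j)))
                mi += (n_ij / N) * np.log(N * n_ij / (a[i] * b[j]))
    return mi
</syntaxhighlight>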
==Adjustment for chance==
As with the [[Rand index]], the baseline value of the mutual information between two random clusterings does not take a constant value; it tends to be larger when the two partitions have more clusters (for a fixed number of set elements ''N'').
By adopting a [[Hypergeometric distribution|hypergeometric]] model of randomness, it can be shown that the expected mutual information between two random clusterings is:
:<math>\begin{align} E\{MI(U,V)\} = &
\sum_{i=1}^R \sum_{j=1}^C
\sum_{n_{ij}=(a_i+b_j-N)^+}^{\min(a_i, b_j)}
\frac{n_{ij}}{N}
\log \left( \frac{ N\cdot n_{ij}}{a_i b_j}\right) \times \\
& \frac{a_i!b_j!(N-a_i)!(N-b_j)!}
{N!n_{ij}!(a_i-n_{ij})!(b_j-n_{ij})!(N-a_i-b_j+n_{ij})!} \\
\end{align}</math>
where <math>(a_i+b_j-N)^+</math> denotes <math>\max(0,a_i+b_j-N)</math>. The variables <math>a_i</math> and <math>b_j</math> are the partial sums of the contingency table; that is,
:<math>a_i=\sum_{j=1}^C n_{ij}</math>
and
:<math>b_j=\sum_{i=1}^R n_{ij}</math>
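The sketch below (again purely illustrative, written as a direct transcription of the triple sum above rather than an optimized routine) evaluates the expected mutual information from the marginals <math>a_i</math> and <math>b_j</math>, using log-factorials for numerical stability:
<syntaxhighlight lang="python">
import numpy as np
from scipy.special import gammaln          # gammaln(n + 1) == log(n!)

def expected_mutual_information(table):
    """E{MI(U, V)} under the hypergeometric model of randomness,
    transcribed directly from the formula above."""
    table = np.asarray(table, dtype=float)
    N = table.sum()
    a = table.sum(axis=1)                  # a_i
    b = table.sum(axis=0)                  # b_j
    emi = 0.0
    for a_i in a:
        for b_j in b:
            # n_ij runs from (a_i + b_j - N)^+ to min(a_i, b_j);
            # the n_ij = 0 term contributes nothing, so start at 1.
            lo = max(1, int(a_i + b_j - N))
            hi = int(min(a_i, b_j))
            for n_ij in range(lo, hi + 1):
                term = (n_ij / N) * np.log(N * n_ij / (a_i * b_j))
                log_p = (gammaln(a_i + 1) + gammaln(b_j + 1)
                         + gammaln(N - a_i + 1) + gammaln(N - b_j + 1)
                         - gammaln(N + 1) - gammaln(n_ij + 1)
                         - gammaln(a_i - n_ij + 1) - gammaln(b_j - n_ij + 1)
                         - gammaln(N - a_i - b_j + n_ij + 1))
                emi += term * np.exp(log_p)
    return emi
</syntaxhighlight>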
The adjusted measure<ref name="vinh-icml09"/> for the mutual information may then be defined to be:
:<math> AMI(U,V)= \frac{MI(U,V)-E\{MI(U,V)\}} {\max{\{H(U),H(V)\}}-E\{MI(U,V)\}}</math>
The AMI takes a value of 1 when the two partitions are identical and 0 when the MI between the two partitions equals the value expected due to chance alone.
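Combining the helper functions from the sketches above gives a complete, though unoptimized, AMI computation using the max-entropy normalization shown here; the block below assumes those earlier functions have been defined. (The scikit-learn function <code>sklearn.metrics.adjusted_mutual_info_score</code> provides a maintained implementation, although its default normalization may differ from the maximum of the two entropies used in this formula.)
<syntaxhighlight lang="python">
def adjusted_mutual_information(u_labels, v_labels):
    """AMI(U, V) with the max(H(U), H(V)) normalization, reusing the
    contingency_table, entropy_from_counts, mutual_information and
    expected_mutual_information helpers sketched above."""
    table = contingency_table(u_labels, v_labels)
    h_u = entropy_from_counts(table.sum(axis=1))
    h_v = entropy_from_counts(table.sum(axis=0))
    mi = mutual_information(table)
    emi = expected_mutual_information(table)
    return (mi - emi) / (max(h_u, h_v) - emi)

# Identical partitions yield 1; unrelated partitions yield values near 0.
print(adjusted_mutual_information([0, 0, 1, 1, 2, 2], [0, 0, 1, 1, 2, 2]))  # ~1.0
</syntaxhighlight>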
==References==
<references />

==External links==
* [http://sites.google.com/site/vinhnguyenx/softwares Matlab code for computing the adjusted mutual information]

[[Category:Information theory]]
[[Category:Machine learning]]