2016
DOI: 10.1111/sjos.12251

Exact Goodness‐of‐Fit Testing for the Ising Model

Abstract: The Ising model is one of the simplest and most famous models of interacting systems. It was originally proposed to model ferromagnetic interactions in statistical physics and is now widely used to model spatial processes in many areas such as ecology, sociology, and genetics, usually without testing its goodness of fit. Here, we propose various test statistics and an exact goodness-of-fit test for the finite-lattice Ising model. The theory of Markov bases has been developed in algebraic statistics f…
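As a rough illustration of the Markov-basis idea mentioned in the abstract, the sketch below runs an exact conditional goodness-of-fit test on a toy log-linear model (independence in a small two-way table), using the basic 2 × 2 moves and a Metropolis walk over the fiber in the style of Diaconis and Sturmfels. It is only a minimal sketch under these assumptions: the counts, move set, and statistic are made up for illustration, and this is not the paper's Ising-specific construction.

import math
import numpy as np

rng = np.random.default_rng(0)

def chi_square(table, expected):
    # Pearson chi-square statistic against fixed expected counts.
    return float(np.sum((table - expected) ** 2 / expected))

def log_null_prob(table):
    # Conditional (hypergeometric) null probability on the fiber, up to a
    # constant: log prod_ij 1/x_ij! = -sum_ij lgamma(x_ij + 1).
    return -sum(math.lgamma(v + 1) for v in table.flat)

def propose(table, rng):
    # One basic Markov-basis move for the independence model: +1/-1 on a
    # randomly chosen 2x2 minor, which preserves all row and column margins.
    r, c = table.shape
    i, k = rng.choice(r, size=2, replace=False)
    j, l = rng.choice(c, size=2, replace=False)
    sign = rng.choice([-1, 1])
    move = np.zeros_like(table)
    move[i, j] = move[k, l] = sign
    move[i, l] = move[k, j] = -sign
    return table + move

# Observed 3x3 contingency table (made-up counts, purely for illustration).
observed = np.array([[12, 7, 3],
                     [5, 9, 8],
                     [2, 6, 11]])
expected = np.outer(observed.sum(1), observed.sum(0)) / observed.sum()
t_obs = chi_square(observed, expected)

# Metropolis walk over the fiber; the exact conditional p-value is estimated
# as the fraction of sampled tables at least as extreme as the observed one.
table, extreme, n_samples = observed.copy(), 0, 20000
for _ in range(n_samples):
    cand = propose(table, rng)
    if (cand >= 0).all():
        if math.log(rng.random()) < log_null_prob(cand) - log_null_prob(table):
            table = cand
    extreme += chi_square(table, expected) >= t_obs
print("estimated exact p-value:", extreme / n_samples)

For the finite-lattice Ising model itself, the relevant fiber and moves are those developed in the paper; the sketch above only mirrors the general mechanism on a toy model.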

Cited by 6 publications (8 citation statements)
References 32 publications (51 reference statements)
“…To our knowledge, no similar results had been obtained so far for Ising models, even though there has been recent intensive work on Ising models in statistics (Foygel Barber & Drton; Martín del Campo et al.; Bhattacharya & Mukherjee) and in machine learning (Murphy; Bresler; Johnson et al.).…”
Section: Introduction
confidence: 61%
“…For example, [Dob12] proposes a general dynamic approach to constructing applicable local moves for marginal table models and proves that they can be utilized to cover the entire fiber. Meanwhile, [HAT12] use a Poisson-size combination of the smallest possible set of moves to explore fibers of discrete logistic regression models, and [MdCCU17] use another small subset of a Markov basis for testing the Ising model on a large biological dataset. Each of these methods extends the use of Markov bases to larger and larger datasets: the approach in [Dob12] was demonstrated on tables up to 256 cells, [HAT12] show their method approaches its limits on 10 × 10 × 10 tables, and [MdCCU17] are able to use their method on a biological dataset of size 800 × 800.…”
Section: Discussion
confidence: 99%
“…Meanwhile, [HAT12] use a Poisson-size combination of the smallest possible set of moves to explore fibers of discrete logistic regression models, and [MdCCU17] use another small subset of a Markov basis for testing the Ising model on a large biological dataset. Each of these methods extends the use of Markov bases to larger and larger datasets: the approach in [Dob12] was demonstrated on tables up to 256 cells, [HAT12] show their method approaches its limits on 10 × 10 × 10 tables, and [MdCCU17] are able to use their method on a biological dataset of size 800 × 800. In this paper, we are able to obtain good results for tables of size 2661 × 2661 × 2 × 2 (networks with 2661 vertices) and show that our method slows down due to a goodness-of-fit statistic computation on tables of size 4334 × 4334 × 2 × 2 (networks with 4334 vertices).…”
Section: Discussion
confidence: 99%
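The "Poisson-size combination of the smallest possible set of moves" mentioned in the statement above can be read, at a high level, as a proposal that applies several basic moves at once. The snippet below is a hypothetical sketch of that general mechanism on the same toy independence model and 2 × 2 moves used in the earlier sketch; it is an assumption about the idea, not the actual algorithm of [HAT12], and a full test would still combine it with a Metropolis acceptance step as before.

import numpy as np

rng = np.random.default_rng(1)

def basic_move(shape, rng):
    # One signed 2x2-minor move; any integer combination of such moves
    # preserves the row and column margins of the table.
    r, c = shape
    i, k = rng.choice(r, size=2, replace=False)
    j, l = rng.choice(c, size=2, replace=False)
    sign = rng.choice([-1, 1])
    move = np.zeros(shape, dtype=int)
    move[i, j] = move[k, l] = sign
    move[i, l] = move[k, j] = -sign
    return move

def poisson_combination_proposal(table, rng, lam=2.0):
    # Draw K ~ 1 + Poisson(lam) basic moves and apply their sum, so a single
    # proposal can move several steps across the fiber; stay put if any cell
    # would become negative.
    k = 1 + rng.poisson(lam)
    composite = sum(basic_move(table.shape, rng) for _ in range(k))
    candidate = table + composite
    return candidate if (candidate >= 0).all() else table

# Same made-up table as in the earlier sketch; margins are preserved.
table = np.array([[12, 7, 3],
                  [5, 9, 8],
                  [2, 6, 11]])
for _ in range(5):
    table = poisson_combination_proposal(table, rng)
print(table)
print(table.sum(axis=0), table.sum(axis=1))  # unchanged margins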
“…Recent results, e.g., [56,32], show that for some very popular models of random graphs, Markov bases can be constructed by appropriately composing generators of the edge subring of a bipartite graph. The computational difficulty of Markov bases has recently been addressed for various models [59,34,32,10] by considering subsets of generators that can be applied to the given data. Another direction of interest is how well Markov chains based on various types of bases behave; interested readers should see, e.g., [55, Section 3.1] for references to the literature on mixing times.…”
Section: Substituting This Into the Previous Equation Gives
confidence: 99%