Self-supervised neural language models have recently achieved unprecedented success, from natural language processing to learning the languages of biological sequences and organic molecules. These models have demonstrated superior performance in generation, structure classification, and function prediction for proteins and molecules using learned representations. However, most masking-based pre-trained language models are not designed for generative design, and their black-box nature makes it difficult to interpret their design logic. Here we propose BLMM Crystal Transformer, a neural-network-based probabilistic generative model for the generative and tinkering design of inorganic materials. Our model is built on the blank-filling language model for text generation and demonstrates unique advantages in learning the "materials grammar", including high-quality generation, interpretability, and data efficiency. It can generate chemically valid materials compositions, of which up to 89.7% are charge-neutral and 84.8% are electronegativity-balanced, enrichments more than 4 and 8 times higher, respectively, than those of enhanced random sampling. Its probabilistic generation steps allow it to recommend generation or tinkering actions with explanations that capture known materials chemistry, making it useful for materials doping. Our models can be trained with fewer than 40,000 materials formulas, demonstrating high data efficiency compared with pre-trained protein or molecule models that require millions of samples. We have applied our model to discover a set of new materials, validated using DFT calculations. Our work thus not only brings unsupervised transformer-language-model-based generative artificial intelligence to inorganic materials, but also has the potential to guide the development of better generative design models in the domains of biology (proteins) and organic molecules. A user-friendly web app has been developed and can be accessed freely at www.materialsatlas.org/blmtinker.
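
For illustration only, the charge-neutrality and electronegativity-balance metrics quoted above can be screened for any generated formula with standard composition checks. The sketch below is a minimal example, assuming pymatgen's oxidation-state guessing and Pauling electronegativities as a stand-in for the paper's validity checks; the `check_validity` helper is a hypothetical name introduced here, not part of the model or its code.

```python
from pymatgen.core import Composition, Element

def check_validity(formula: str):
    """Heuristic screening of a generated composition:
    - charge neutrality: does at least one charge-neutral oxidation-state
      assignment exist?
    - electronegativity balance: in some neutral assignment, are all cations
      less electronegative than all anions?
    """
    comp = Composition(formula)
    # All charge-neutral oxidation-state assignments pymatgen can find.
    guesses = comp.oxi_state_guesses()
    charge_neutral = len(guesses) > 0

    electroneg_balanced = False
    for assignment in guesses:
        cations = [el for el, ox in assignment.items() if ox > 0]
        anions = [el for el, ox in assignment.items() if ox < 0]
        if cations and anions:
            max_cation_x = max(Element(el).X for el in cations)
            min_anion_x = min(Element(el).X for el in anions)
            if max_cation_x <= min_anion_x:
                electroneg_balanced = True
                break
    return charge_neutral, electroneg_balanced

# Example usage on two known-valid compositions.
print(check_validity("SrTiO3"))  # expected: (True, True)
print(check_validity("NaCl"))    # expected: (True, True)
```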