More than 20 organisations use Conservation Action Planning (CAP), Healthy Country Planning and the Open Standards for the Practice of Conservation in over 140 projects, covering almost 160 million ha across Australia. This review documents the history, evolution and application of CAP in Australia and discusses its strengths, limitations and lessons learnt by users, including conservation planners, practitioners and policymakers.
Native fauna in Australia’s arid zone has declined significantly since European settlement; however, Martu country in the Western Desert of Western Australia retains a diversity of iconic and threatened species that were once more widespread. An innovative partnership between The Nature Conservancy, BHP Billiton and the Martu people (represented by Kanyirninpa Jukurrpa – KJ) is achieving positive social, cultural, economic and environmental outcomes, building on funding from the Australian Government for land management on Martu country. The partners support Martu people in fulfilling their desire to conserve the cultural and natural values of their 13.7 million ha native title determination area. Through KJ as the local delivery partner, Martu people are returning to work on country to clean and protect waterholes; improve fire management; control feral herbivores and predators; manage cultural heritage; and actively manage priority threatened species (such as the Greater Bilby and the Black-flanked Rock-wallaby). The project provides significant employment opportunities for Martu men and women in ranger teams working throughout their country. It is also generating measurable social, cultural and economic benefits for Martu people, and environmental benefits for part of the most intact arid ecosystem anywhere on Earth.
Attention mechanisms, and non-local mean operations in general, are key ingredients in many state-of-the-art deep learning techniques. In particular, the Transformer model based on multi-head self-attention has recently achieved great success in natural language processing and computer vision. However, the vanilla algorithm computing the Transformer of an image with n pixels has O(n²) complexity, which is often painfully slow and sometimes prohibitively expensive for large-scale image data. In this paper, we propose a fast randomized algorithm, SCRAM, that requires only O(n log n) time to produce an image attention map. This dramatic acceleration is attributed to our insight that attention maps on real-world images usually exhibit (1) spatial coherence and (2) sparse structure. The central idea of SCRAM is to employ PatchMatch, a randomized correspondence algorithm, to quickly pinpoint the most compatible key (argmax) for each query, and then to exploit that knowledge to design a sparse approximation to the non-local mean operation. Using the argmax (mode) to dynamically construct the sparse approximation distinguishes our algorithm from all existing sparse approximation methods and makes it very efficient. Moreover, SCRAM is a broadly applicable approximation to any non-local mean layer, in contrast to some other sparse approximations that can only approximate self-attention. Our preliminary experimental results suggest that SCRAM is indeed promising for speeding up and scaling up the computation of attention maps in the Transformer.
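The mode-centred sparsification described above can be illustrated with a minimal sketch. This is not the authors' implementation: it uses a brute-force argmax over a 1-D sequence where SCRAM uses PatchMatch over a 2-D image to reach O(n log n), and the choice of a fixed window of keys around the best match is an illustrative assumption standing in for the spatial-coherence structure the paper exploits.

```python
import numpy as np

def sparse_attention_sketch(Q, K, V, window=3):
    """Mode-based sparse attention (illustrative sketch).

    For each query, locate its most compatible key (the argmax, or
    "mode"), then compute softmax attention over only a small window
    of keys around that match instead of all n keys.

    NOTE: the brute-force argmax below is O(n) per query and is used
    only for clarity; SCRAM replaces this step with the randomized
    PatchMatch correspondence search. The 1-D window is a hypothetical
    stand-in for a 2-D spatially coherent neighbourhood.
    """
    n, d = Q.shape
    out = np.zeros_like(V, dtype=float)
    for i in range(n):
        scores = Q[i] @ K.T                     # compatibility with every key
        j = int(np.argmax(scores))              # most compatible key (mode)
        lo, hi = max(0, j - window), min(n, j + window + 1)
        s = scores[lo:hi]
        w = np.exp(s - s.max())                 # numerically stable softmax
        w /= w.sum()
        out[i] = w @ V[lo:hi]                   # sparse non-local mean
    return out
```

Because each query aggregates over only 2·window + 1 values rather than n, the aggregation step is linear in n for fixed window size; the overall complexity then hinges on how cheaply the argmax can be found, which is exactly the role PatchMatch plays in SCRAM.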