Johnson JW, Mahmood F, Xu W, Durr N, Yuille A. Structured Prediction Using cGANs with Fusion Discriminator. International Conference on Learning Representations Workshop on Deep Generative Models for Highly Structured Data [Internet]. 2019.
We propose a novel method for incorporating conditional information into a generative adversarial network (GAN) for structured prediction tasks. The method fuses features from the generated output and the conditional information in feature space, allowing the discriminator to better capture higher-order statistics from the data. It also strengthens the signals passed through the network wherever the real or generated data and the conditional data agree. The proposed method is conceptually simpler than joint convolutional neural network–conditional Markov random field (CNN-CRF) models and enforces higher-order consistency without being limited to a very specific class of higher-order potentials. Experimental results demonstrate that the method yields improvements on a variety of structured prediction tasks, including image synthesis, semantic segmentation, and depth estimation.
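As a rough illustration of the fusion idea described in the abstract, the sketch below fuses features of a sample y and its conditioning input c by elementwise addition before scoring, so channels where both branches respond are reinforced. This is a toy sketch, not the paper's architecture: the linear-plus-ReLU "feature extractors", the dimensions, and all names are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_feat = 8, 4                       # illustrative sizes, not from the paper
Wy = rng.standard_normal((d_feat, d_in))  # feature extractor for the sample
Wc = rng.standard_normal((d_feat, d_in))  # feature extractor for the conditioning
w = rng.standard_normal(d_feat)           # scoring head

def features(W, x):
    """Toy feature extractor: a linear map followed by ReLU."""
    return np.maximum(W @ x, 0.0)

def fusion_score(y, c):
    """Score a (sample, condition) pair by additively fusing the two
    feature maps, then projecting to a scalar realness score."""
    fused = features(Wy, y) + features(Wc, c)
    return float(w @ fused)

y = rng.standard_normal(d_in)  # stand-in for a generated or real sample
c = rng.standard_normal(d_in)  # stand-in for the conditioning input
print(fusion_score(y, c))
```

Additive fusion (as opposed to simply concatenating y and c at the input) means a fused channel is large exactly when both the sample branch and the conditioning branch activate, which is the agreement-reinforcing behavior the abstract describes.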
Johnson JW. Teaching Neural Networks in the Deep Learning Era. J. Comput. Sci. Coll. 2019;34(6):16–25.
This paper describes the design and evaluation of the first iteration of a standalone course in neural networks aimed at upper level undergraduates and first-year graduate students. The development of this course was motivated by recent state-of-the-art results on challenging tasks in computer vision and natural language processing that have been obtained using deep neural networks, and the subsequent widespread adoption of these models for various applications in industry. The course design emphasizes theoretical understanding and development of applications following existing best practices. Throughout, many unsettled aspects of the underlying mathematical theory of deep neural networks are highlighted, and students are prepared to adapt as current trends and techniques evolve.
Johnson JW. Scaling Up: Introducing Undergraduates to Data Science Early in Their College Careers. J. Comput. Sci. Coll. [Internet]. 2018;33(6):76–85.
It has historically been the case that most data science and analytics programs are offered at the Master of Science level. What few undergraduate offerings exist are frequently limited to either a standalone course or a small number of courses targeted to upper-level undergraduates. Literature on how best to teach data science to undergraduate students is practically nonexistent. We review recent work on establishing standards and learning objectives for undergraduate data science education, and we make the case that undergraduate students should be exposed to data science early in their college careers. We describe the strategy used to teach an introductory course in data science aimed not at upper-level students, but at undergraduates in their first or second year of study. This course assumes no prerequisite knowledge in computing, mathematics, or statistics, aligns well with recently outlined objectives for undergraduate data science education, and has a track record of success over five consecutive semesters.
Johnson J. Neural Style Representations and the Large-Scale Classification of Artistic Style. In: Proceedings of the 2017 Future Technologies Conference (FTC); 2017:283–285.
The artistic style of a painting can be sensed by the average observer, but algorithmically classifying the artistic style of an artwork is a difficult problem. The recently introduced neural style algorithm uses features constructed from the low-level activations of a pretrained convolutional neural network to merge the artistic style of one image or set of images with the content of another. This paper investigates the effectiveness of various representations based on the neural style algorithm for algorithmically classifying the artistic style of paintings, and compares this approach with other neural-network-based approaches to artistic style classification. Results competitive with other recent work on this challenging problem are obtained.
Johnson JW. Data Science & Computing Across the Curriculum. J. Comput. Sci. Coll. [Internet]. 2017;32(6):187–188.
Data science is an applied, interdisciplinary field that draws on technical skills at the intersection of computing, applied mathematics, and statistics. In addition to these technical skills, the successful data scientist is typically also an expert in an unrelated field where data analysis can be utilized. The interdisciplinary nature of data science presents a unique opportunity to introduce broadly applicable fundamentals of computer science to a larger audience than is typically found in our computer science courses and programs. This poster describes the approach to introductory data science instruction recently taken at the University of New Hampshire.
Johnson JW. Weight ideals associated to regular and log-linear arrays. Journal of Symbolic Computation [Internet]. 2015;67:1–15.
Certain weight-based orders on the free associative algebra R = k〈x1,…,xt〉 can be specified by t×∞ arrays whose entries come from the subring of positive elements in a totally ordered field. If such an array satisfies certain additional conditions, it produces a partial order on R which is an admissible order on the quotient R/I, where the ideal I is a homogeneous binomial ideal called the weight ideal associated to the array. The structure of the weight ideal is determined entirely by the array. This article discusses the structure of the weight ideals associated to two distinct types of arrays which define admissible orders on the associated quotient algebra.
Johnson JW. The number of group homomorphisms from Dm into Dn. The College Mathematics Journal. 2013;44(3):190–192.
We count the number of group homomorphisms between any two dihedral groups using only elementary group theory.
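The count in the note above can be checked by brute force: by von Dyck's theorem, a homomorphism Dm → Dn is determined by a choice of images a = φ(r), b = φ(s) in Dn satisfying the defining relations r^m = s^2 = 1 and srs = r^{-1}. A minimal sketch (the encoding of Dn as pairs (j, e) standing for r^j s^e, and all function names, are ours, not the paper's):

```python
def dihedral_elements(n):
    """Elements of D_n as pairs (j, e) standing for r^j s^e."""
    return [(j, e) for j in range(n) for e in range(2)]

def mul(n, x, y):
    """Product rule r^{j1} s^{e1} * r^{j2} s^{e2} = r^{j1 + (-1)^{e1} j2} s^{e1+e2}."""
    (j1, e1), (j2, e2) = x, y
    sign = -1 if e1 else 1
    return ((j1 + sign * j2) % n, (e1 + e2) % 2)

def power(n, x, k):
    result = (0, 0)  # identity
    for _ in range(k):
        result = mul(n, result, x)
    return result

def inverse(n, x):
    j, e = x
    # reflections are involutions; a rotation r^j inverts to r^{-j}
    return (j, e) if e else ((-j) % n, 0)

def count_homs(m, n):
    """Count pairs (a, b) in D_n x D_n satisfying the defining relations
    of D_m; each such pair determines exactly one homomorphism."""
    identity = (0, 0)
    count = 0
    for a in dihedral_elements(n):
        if power(n, a, m) != identity:   # needs a^m = 1
            continue
        for b in dihedral_elements(n):
            if power(n, b, 2) != identity:  # needs b^2 = 1
                continue
            if mul(n, mul(n, b, a), b) == inverse(n, a):  # b a b = a^{-1}
                count += 1
    return count

print(count_homs(3, 3))  # → 10
```

For example, count_homs(3, 3) returns 10, matching the ten endomorphisms of D3 ≅ S3 (six automorphisms, three maps onto order-2 subgroups, and the trivial map).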