Philip T. Reisshttps://works.bepress.com/phil_reiss/Recent works by Philip T. Reissen-usCopyright (c) 2018 All rights reserved.Mon, 01 Jan 2018 00:00:00 +00003600A time-varying measure of dyadic synchrony for three-dimensional motionhttps://works.bepress.com/phil_reiss/45/<div class="line" id="line-5"><span style="font-family: CMR12; font-size: 12pt;">We propose a novel approach to the analysis of synchronized three-dimensional motion in dyads. Motion recorded at high time resolution, as with a gaming device, is preprocessed in each of the three spatial dimensions by spline smoothing. Synchrony is then defined, at each time point, as the cosine between the two individuals’ estimated velocity vectors. The approach is extended to allow a time lag, enabling analysis of leader-follower dynamics. Mean square cosine over the time range is proposed as a scalar summary of dyadic synchrony, and this measure is found to be positively associated with cognitive empathy. </span></div>Philip T. Reiss et al.Mon, 01 Jan 2018 00:00:00 +0000https://works.bepress.com/phil_reiss/45/PreprintsCross-sectional versus longitudinal designs for function estimation, with an application to cerebral cortex developmenthttps://works.bepress.com/phil_reiss/44/<div class="line" id="line-17">Motivated by studies of the development of the human cerebral cortex, we consider</div><div class="line" id="line-19">the estimation of a mean growth trajectory and the relative merits of cross-sectional</div><div class="line" id="line-21">and longitudinal data for that task. We define a class of relative efficiencies that</div><div class="line" id="line-23">compare function estimates in terms of aggregate variance of a parametric function</div><div class="line" id="line-25">estimate. 
These generalize the classical design effect for estimating a scalar with</div><div class="line" id="line-27">cross-sectional versus longitudinal data, and in particular cases are shown to be</div><div class="line" id="line-29">bounded above by it. Turning to nonparametric function estimation, we find that</div><div class="line" id="line-31">longitudinal fits may tend to have higher aggregate variance than cross-sectional</div><div class="line" id="line-33">ones, but that this may occur because the former have higher effective degrees of</div><div class="line" id="line-35">freedom reflecting greater sensitivity to subtle features of the estimand. These ideas</div><div class="line" id="line-37">are illustrated with cortical thickness data from a longitudinal neuroimaging study.</div>Philip T. ReissMon, 01 Jan 2018 00:00:00 +0000https://works.bepress.com/phil_reiss/44/Published and in-press articlesMethods for scalar-on-function regressionhttps://works.bepress.com/phil_reiss/40/<div class="line" id="line-17">Recent years have seen an explosion of activity in the field of functional data analysis (FDA), in which curves, spectra, images, etc. are considered as basic functional data units. A central problem in FDA is how to fit regression models with scalar responses and functional data points as predictors. We review some of the main approaches to this problem, categorizing the basic model types as linear, nonlinear and nonparametric. We discuss publicly available software packages, and illustrate some of the procedures by application to a functional magnetic resonance imaging dataset.</div>Philip T. Reiss et al.Tue, 01 Aug 2017 00:00:00 +0000https://works.bepress.com/phil_reiss/40/Published and in-press articlesPointwise influence matrices for functional-response regressionhttps://works.bepress.com/phil_reiss/43/<div class="line" id="line-47">We extend the notion of an influence or hat matrix to regression with functional responses and scalar predictors. 
For responses depending linearly on a set of predictors, our definition is shown to reduce to the conventional influence matrix for linear models. The pointwise degrees of freedom, the trace of the pointwise hat matrix, are shown to have an adaptivity property that motivates a two-step bivariate smoother for modeling nonlinear dependence on a single predictor. This procedure adapts to varying complexity of the nonlinear model at different locations along the function, and thereby achieves better performance than competing tensor product smoothers in an analysis of the development of white matter microstructure in the brain. </div><div class="line" id="line-49"><br></div>Philip T. Reiss et al.Sun, 01 Jan 2017 00:00:00 +0000https://works.bepress.com/phil_reiss/43/Published and in-press articlesForgetting first words on a list may signal mental declinehttps://works.bepress.com/phil_reiss/36/Rachael RettnerFri, 08 Mar 2013 00:00:00 +0000https://works.bepress.com/phil_reiss/36/Blogging and press coveragerefund: Regression with Functional Datahttps://works.bepress.com/phil_reiss/32/Ciprian M. Crainiceanu et al.Tue, 01 Jan 2013 00:00:00 +0000https://works.bepress.com/phil_reiss/32/SoftwareMassively Parallel Nonparametrics [HDS 2011 slides]https://works.bepress.com/phil_reiss/23/Philip T. Reiss et al.Sun, 01 May 2011 00:00:00 +0000https://works.bepress.com/phil_reiss/23/PresentationsFast Function-on-Scalar Regression with Penalized Basis Expansionshttps://works.bepress.com/phil_reiss/16/<p>Regression models for functional responses and scalar predictors are often fitted by means of basis functions, with quadratic roughness penalties applied to avoid overfitting. The fitting approach described by Ramsay and Silverman in the 1990s amounts to a penalized ordinary least squares (P-OLS) estimator of the coefficient functions. We recast this estimator as a generalized ridge regression estimator, and present a penalized generalized least squares (P-GLS) alternative. 
We describe algorithms by which both estimators can be implemented, with automatic selection of optimal smoothing parameters, in a more computationally efficient manner than has heretofore been available. We discuss pointwise confidence intervals for the coefficient functions, simultaneous inference by permutation tests, and model selection, including a novel notion of pointwise model selection. P-OLS and P-GLS are compared in a simulation study. Our methods are illustrated with an analysis of age effects in a functional magnetic resonance imaging data set, as well as a reanalysis of a now-classic Canadian weather data set. An R package implementing the methods is publicly available.</p>
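As a minimal illustration of the P-OLS idea above (penalized least squares recast as a generalized ridge estimator), the sketch below fits basis coefficients c by minimizing ||y - Bc||^2 + lambda ||Dc||^2, with D a second-difference matrix. The polynomial basis, the hand-rolled solver, and the function names are hypothetical conveniences for a self-contained example; the paper's actual algorithms use spline bases and automatic smoothing-parameter selection.

```python
def second_diff_matrix(k):
    """(k-2) x k second-difference penalty matrix D."""
    D = [[0.0] * k for _ in range(k - 2)]
    for i in range(k - 2):
        D[i][i], D[i][i + 1], D[i][i + 2] = 1.0, -2.0, 1.0
    return D

def transpose(A):
    return [list(r) for r in zip(*A)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def solve(A, b):
    """Gaussian elimination with partial pivoting for A x = b."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def pols_fit(ts, y, k, lam):
    """Generalized-ridge form of penalized LS: c = (B'B + lam D'D)^{-1} B'y,
    here with a degree-(k-1) polynomial basis evaluated at the points ts."""
    B = [[t ** j for j in range(k)] for t in ts]
    D = second_diff_matrix(k)
    Bt = transpose(B)
    A = matmul(Bt, B)
    P = matmul(transpose(D), D)
    for i in range(k):
        for j in range(k):
            A[i][j] += lam * P[i][j]
    rhs = [sum(bi * yi for bi, yi in zip(Bt[i], y)) for i in range(k)]
    return solve(A, rhs)
```

With lam = 0 this is ordinary least squares; increasing lam shrinks the coefficient vector toward the null space of D, which is the generalized-ridge behavior the abstract refers to.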
Philip T. Reiss et al.Fri, 01 Jan 2010 00:00:00 +0000https://works.bepress.com/phil_reiss/16/Published and in-press articlesOptimizing the Expected Overlap of Survey Samples via the Northwest Corner Rulehttps://works.bepress.com/phil_reiss/8/<p>In survey sampling there is often a need to coordinate the selection of pairs of samples drawn from two overlapping populations so as to maximize or minimize their expected overlap, subject to constraints on the marginal probabilities determined by the respective designs. For instance, maximizing the expected overlap between repeated samples can stabilize the resulting estimates of change and reduce the costs of first contacts; minimizing the expected overlap can avoid overburdening respondents with multiple surveys. We focus on the important special case in which both samples are selected by simple random sampling without replacement (SRSWOR) conducted independently within each stratum. Optimizing the expected sample overlap can be formulated as a linear programming problem known as a transportation problem (TP). We show that by appropriately grouping and ordering the possible samples in each survey, one can reduce the initial TP to a much smaller TP amenable to solution by an algorithm known as the Northwest Corner Rule (NWCR). The proposed NWCR method proceeds in two easily implemented steps: first selecting the numbers of births (new units) and deaths (deleted units) by a random selection from a hypergeometric distribution, and then selecting the births and deaths by SRSWOR. We formally prove properties of the NWCR solutions, including a minimal variance property of the minimal overlap solution. In a simulation study, the NWCR method compares favorably with a popular method based on assignment of permanent random numbers to each sampling unit.</p>
Lenka Mach et al.Fri, 01 Dec 2006 00:00:00 +0000https://works.bepress.com/phil_reiss/8/Published and in-press articlesRegression with Signals and Images as Predictorshttps://works.bepress.com/phil_reiss/11/<p>Signal regression and image regression, in which the outcomes are scalars and the predictors are one-dimensional signals or multidimensional images, are of interest in many scientific fields. The principal statistical challenge is how to reduce the dimension of the predictors in what would otherwise be a severely ill-posed problem. A pair of novel methods, functional principal component regression (FPCR) and functional partial least squares (FPLS), combine two existing approaches to the dimension reduction problem: selection of most relevant components, as is done in ordinary principal component regression (PCR) and partial least squares (PLS), and restriction of the coefficient function to the span of a spline basis. Chapter 1 outlines several existing signal regression methods and presents two ways to define FPCR/FPLS, based, respectively, on regularized components and regularized regression. Simulations and real data analyses with chemometric data in Chapter 2 demonstrate the strong performance of the regularized-regression form of FPCR/FPLS, compared with the regularized-components form as well as with existing methods. Chapter 3 discusses how to incorporate covariates in FPCR/FPLS, and presents some results on the generalized cross-validation and restricted maximum likelihood approaches to smoothing parameter selection for a broader class of semiparametric regression models. In Chapter 4, FPCR and FPLS are extended from signals to images and from linear to generalized linear models. Chapter 5 applies FPCR and FPLS, in their linear and logistic regression versions, to neuroimaging data sets in which the predictors are maps of serotonin receptor and transporter binding in the brain.</p>
Philip T. ReissSun, 01 Jan 2006 00:00:00 +0000https://works.bepress.com/phil_reiss/11/Dissertation
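The dyadic synchrony measure from the first abstract above can be sketched in a few lines: pointwise cosines between the two movers' velocity vectors, a non-negative lag for leader-follower analysis, and the mean square cosine as the scalar summary. Velocities here are crude finite differences of raw positions, whereas the paper differentiates spline-smoothed trajectories, so this is a schematic rather than the authors' implementation.

```python
import math

def velocities(track):
    """Finite-difference velocity vectors for a list of (x, y, z) positions."""
    return [tuple(b - a for a, b in zip(p, q)) for p, q in zip(track, track[1:])]

def cosine(u, w):
    dot = sum(a * b for a, b in zip(u, w))
    nu = math.sqrt(sum(a * a for a in u))
    nw = math.sqrt(sum(b * b for b in w))
    return dot / (nu * nw) if nu > 0 and nw > 0 else 0.0

def synchrony(track1, track2, lag=0):
    """Pointwise cosine synchrony; a positive lag compares mover 1's velocity
    with mover 2's velocity `lag` steps later (non-negative lags only)."""
    v1, v2 = velocities(track1), velocities(track2)
    if lag > 0:
        v1, v2 = v1[:-lag], v2[lag:]
    return [cosine(u, w) for u, w in zip(v1, v2)]

def mean_square_cosine(track1, track2, lag=0):
    """Scalar summary of dyadic synchrony over the common time range."""
    c = synchrony(track1, track2, lag)
    return sum(x * x for x in c) / len(c)
```

For a follower who copies the leader's positions one step late, the lag-1 mean square cosine is (up to rounding) 1, while the lag-0 value is strictly smaller, which is the leader-follower pattern the lagged measure is meant to detect.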