Concept: Polynomial interpolation
Electrocardiogram (ECG) based biometric matching suffers from high misclassification error when the sampling frequency of the data is low. This can lead to an unreliable and vulnerable identity-authentication process in high-security applications. In this paper, quality-enhancement techniques for low-sampling-frequency ECG data are proposed for person identification, based on piecewise cubic Hermite interpolation (PCHIP) and piecewise cubic spline interpolation (SPLINE). A total of 70 ECG recordings from 4 public ECG databases with 2 different sampling frequencies were used for development and performance comparison. An analytical method was used for feature extraction. The ECG recordings were segmented into two parts: the enrolment and recognition datasets. Three biometric matching methods, namely Cross Correlation (CC), Percent Root-Mean-Square Deviation (PRD) and Wavelet Distance Measurement (WDM), were used for performance evaluation before and after applying the interpolation techniques. The experimental results suggest that biometric matching with interpolated ECG data achieves, on average, a matching percentage higher by up to 4% for CC, 3% for PRD and 94% for WDM, compared with the existing method using ECG recordings at the lower sampling frequency. Moreover, increasing the sample size from 56 to 70 subjects improves the results by 4% for CC, 14.6% for PRD and 0.3% for WDM. Furthermore, a classification accuracy of up to 99.1% for PCHIP and 99.2% for SPLINE with interpolated ECG data, compared with up to 97.2% without interpolation, supports the claim that applying interpolation techniques enhances the quality of the ECG data.
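The upsampling step described above can be sketched with SciPy's `PchipInterpolator` and `CubicSpline`. This is a minimal illustration, not the paper's pipeline: the signal is a synthetic two-tone waveform standing in for an ECG beat, and the 50 Hz/250 Hz rates are hypothetical stand-ins for the low and reference sampling frequencies.

```python
import numpy as np
from scipy.interpolate import PchipInterpolator, CubicSpline

# Synthetic "ECG-like" signal sampled at a hypothetical low rate (50 Hz)
fs_low, fs_high = 50, 250
t_low = np.arange(0, 1, 1 / fs_low)
x_low = np.sin(2 * np.pi * 5 * t_low) + 0.25 * np.sin(2 * np.pi * 12 * t_low)

# Dense time grid at the target rate, kept inside the interpolation range
t_high = np.arange(0, t_low[-1], 1 / fs_high)

x_pchip = PchipInterpolator(t_low, x_low)(t_high)   # shape-preserving (PCHIP)
x_spline = CubicSpline(t_low, x_low)(t_high)        # C2-smooth (SPLINE)

# Compare against a directly sampled high-rate reference
x_ref = np.sin(2 * np.pi * 5 * t_high) + 0.25 * np.sin(2 * np.pi * 12 * t_high)
rmse_pchip = np.sqrt(np.mean((x_pchip - x_ref) ** 2))
rmse_spline = np.sqrt(np.mean((x_spline - x_ref) ** 2))
```

For smooth signals the spline tends to track the reference more closely, while PCHIP avoids overshoot near sharp features such as the QRS complex, which is why both are worth comparing.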
Background: Infant body mass index (BMI) peak characteristics and early childhood BMI are emerging markers of future obesity and cardiometabolic disease risk, but little is known about their maternal nutritional determinants. Objective: We investigated the associations of maternal macronutrient intake with infant BMI peak characteristics and childhood BMI in the Growing Up in Singapore Towards healthy Outcomes study. Design: With the use of infant BMI data from birth to age 18 mo, infant BMI peak characteristics [age (in months) and magnitude (BMIpeak; in kg/m²) at peak, and prepeak velocity] were derived from subject-specific BMI curves fitted with a mixed-effects model with a natural cubic spline function. Associations of maternal macronutrient intake (assessed by a 24-h recall during late gestation) with infant BMI peak characteristics (n = 910) and BMI z scores at ages 2, 3, and 4 y were examined with multivariable linear regression. Results: Mean absolute maternal macronutrient intakes (percentages of energy) were 72 g protein (15.6%), 69 g fat (32.6%), and 238 g carbohydrate (51.8%). A 25-g (∼100-kcal) increase in maternal carbohydrate intake was associated with a 0.01/mo (95% CI: 0.0003, 0.01/mo) higher prepeak velocity and a 0.04 (95% CI: 0.01, 0.08) higher BMIpeak. These associations were mainly driven by sugar intake, whereby a 25-g increment of maternal sugar intake was associated with a 0.02/mo (95% CI: 0.01, 0.03/mo) higher infant prepeak velocity and a 0.07 (95% CI: 0.01, 0.13) higher BMIpeak. Higher maternal carbohydrate and sugar intakes were associated with a higher offspring BMI z score at ages 2-4 y. Maternal protein and fat intakes were not consistently associated with the studied outcomes. Conclusion: Higher maternal carbohydrate and sugar intakes are associated with unfavorable infancy BMI peak characteristics and higher early childhood BMI. This trial was registered at clinicaltrials.gov as NCT01174875.
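The curve-fitting idea behind deriving BMI peak characteristics can be illustrated with a natural cubic spline. This sketch is a simplification: the study fits a mixed-effects model across subjects, whereas here a single hypothetical infant's measurements are fitted directly, and all ages and BMI values are invented for illustration.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical BMI measurements (kg/m^2) for one infant, ages in months
age = np.array([0.0, 3.0, 6.0, 9.0, 12.0, 15.0, 18.0])
bmi = np.array([13.0, 16.2, 17.4, 17.8, 17.5, 17.0, 16.6])

# Natural cubic spline: second derivative forced to zero at both endpoints
curve = CubicSpline(age, bmi, bc_type="natural")

# Locate the BMI peak on a fine grid
grid = np.linspace(0, 18, 1801)
fitted = curve(grid)
peak_idx = int(np.argmax(fitted))
age_peak = grid[peak_idx]      # age at peak (months)
bmi_peak = fitted[peak_idx]    # BMIpeak magnitude (kg/m^2)

# A crude prepeak velocity: mean BMI gain per month from birth to the peak
prepeak_velocity = (bmi_peak - bmi[0]) / age_peak
```

The "natural" boundary condition is what distinguishes this from a clamped or not-a-knot spline: it avoids imposing arbitrary slopes at birth and 18 months.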
Based on geostatistical theory and the ArcGIS geostatistical module, data from 30 groundwater-level observation wells were used to estimate the decline of the groundwater level in the Beijing piedmont. Seven interpolation methods (inverse distance weighted interpolation, global polynomial interpolation, local polynomial interpolation, tension spline interpolation, ordinary Kriging, simple Kriging and universal Kriging) were used to interpolate the groundwater level between 2001 and 2013. Cross-validation, absolute error and the coefficient of determination (R²) were applied to evaluate the accuracy of the different methods. The results show that the simple Kriging method gave the best fit. The analysis of spatial and temporal variability suggests that the nugget effect increased from 2001 to 2013, which means that the spatial correlation weakened gradually under the influence of human activities. The spatial variability in the middle of the alluvial-proluvial fan is relatively higher than at its top and bottom. Because of changes in land use, the groundwater level also varies over time: the average decline rate of the groundwater level between 2007 and 2013 increased compared with 2001-2006. Urban development and population growth have caused over-exploitation in residential and industrial areas. The decline rate of the groundwater level in residential, industrial and river areas is relatively high, while the shrinking of farmland and the development of water-saving irrigation have reduced agricultural water use, so the decline rate of the groundwater level in agricultural areas is not significant.
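The cross-validation workflow used to rank the seven methods can be sketched for the simplest of them, inverse distance weighting. This is not the paper's data or its Kriging models: the 30 well coordinates and levels below are randomly generated, and leave-one-out prediction stands in for the ArcGIS cross-validation tool.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical observation wells: (x, y) positions and water levels with
# a linear regional trend plus measurement noise
xy = rng.uniform(0, 10, size=(30, 2))
z = 50.0 - 0.8 * xy[:, 0] + 0.5 * xy[:, 1] + rng.normal(0, 0.2, 30)

def idw(xy_obs, z_obs, xy_query, power=2.0):
    """Inverse distance weighted estimate at each query point."""
    d = np.linalg.norm(xy_obs[None, :, :] - xy_query[:, None, :], axis=2)
    d = np.maximum(d, 1e-12)          # guard against zero distance
    w = 1.0 / d ** power
    return (w * z_obs).sum(axis=1) / w.sum(axis=1)

# Leave-one-out cross-validation: predict each well from the other 29
errors = []
for i in range(len(z)):
    mask = np.arange(len(z)) != i
    pred = idw(xy[mask], z[mask], xy[i : i + 1])[0]
    errors.append(abs(pred - z[i]))
mae = float(np.mean(errors))
```

The same leave-one-out loop applies unchanged to any of the other interpolators; only the `idw` call is swapped for, e.g., a Kriging predictor.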
We introduce current home Internet of Things (IoT) technology and present research on its various forms and applications in real life. In addition, we describe IoT marketing strategies as well as specific modeling techniques for improving air quality, a key home IoT service. To this end, we summarize the latest research on sensor-based home IoT, studies on indoor air quality, and technical studies on random data generation. In addition, we develop an air quality improvement model that can be readily applied to the market by acquiring initial analytical data and building infrastructures using spectrum/density analysis and the natural cubic spline method. Accordingly, we generate related data based on user behavioral values. We integrate the logic into the existing home IoT system to enable users to easily access the system through the Web or mobile applications. We expect that the present introduction of a practical marketing application method will contribute to enhancing the expansion of the home IoT market.
In the present work, a spline-based integration technique for the reconstruction of a freeform wavefront from slope data has been implemented. The slope data of a freeform surface contain noise from the machining process, which introduces reconstruction error. We propose a weighted cubic-spline-based least-squares integration method (WCSLI) for the faithful reconstruction of a wavefront from noisy slope data. In the proposed method, the measured slope data are fitted with a piecewise polynomial whose coefficients are determined using a smoothing cubic spline fitting method. The smoothing parameter locally assigns a relative weight to the fitted slope data. The fitted slope data are then integrated using the standard least-squares technique to reconstruct the freeform wavefront. Simulation studies show improved results with the proposed technique compared with the existing cubic-spline-based least-squares integration (CSLI) and Southwell methods. The proposed reconstruction method has been applied experimentally to a subaperture-stitching-based measurement of a freeform wavefront using a scanning Shack-Hartmann sensor. Boundary artifacts are minimal in WCSLI, which improves the subaperture stitching accuracy and demonstrates an improved Shack-Hartmann sensor for freeform metrology applications.
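The fit-then-integrate idea can be shown in a one-dimensional analogue: smooth noisy slope measurements with a smoothing cubic spline, then integrate the fit to recover the wavefront. This is only a sketch of the principle, not WCSLI itself (which works on 2-D Shack-Hartmann slope grids with locally assigned weights); the test wavefront sin(x) and the noise level are assumptions.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Noisy measurements of the slope f'(x) of a known wavefront f(x) = sin(x)
x = np.linspace(0, np.pi, 200)
rng = np.random.default_rng(1)
slope_noisy = np.cos(x) + rng.normal(0, 0.05, x.size)

# Smoothing cubic spline fit of the slope data; the smoothing factor s
# plays the role of the fidelity-vs-smoothness weight (here chosen to
# match the expected sum of squared noise residuals, 200 * 0.05**2)
fit = UnivariateSpline(x, slope_noisy, k=3, s=0.5)

# Integrate the fitted slope to reconstruct the wavefront
recon = np.array([fit.integral(x[0], xi) for xi in x])
rmse = np.sqrt(np.mean((recon - np.sin(x)) ** 2))
```

Integrating the raw noisy slopes instead would accumulate a random-walk error; smoothing the slopes first is what keeps the reconstruction faithful.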
Photoplethysmographic signals are useful for heart rate variability (HRV) analysis in practical ambulatory applications. While reducing the sampling rate of signals is an important consideration for modern wearable devices that enable 24/7 continuous monitoring, few studies have investigated how to compensate for the low timing resolution of low-sampling-rate signals to achieve accurate heart rate variability analysis. In this study, we applied the parabola approximation method and measured it against the conventional cubic spline interpolation method for the time-, frequency-, and nonlinear-domain variables of heart rate variability. For each parameter, the intra-class correlation, standard error of measurement, Bland-Altman 95% limits of agreement and root-mean-squared relative error are presented, and the elapsed time taken to compute each interpolation algorithm was also investigated. The results indicate that parabola approximation is a simple, fast, and accurate method for compensating for the low timing resolution of pulse beat intervals, with performance comparable to the conventional cubic spline interpolation method. Even though the absolute values of the heart rate variability variables calculated from a signal sampled at 20 Hz did not exactly match those calculated from a reference signal sampled at 250 Hz, the parabola approximation method remains a good interpolation method for assessing trends in HRV measurements for low-power wearable applications.
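The parabola approximation refines a pulse-peak time beyond the sample grid by fitting a parabola through the discrete maximum and its two neighbours. A minimal sketch follows; the Gaussian pulse shape, the 20 Hz rate, and the true peak time are all invented for the demonstration.

```python
import numpy as np

def parabolic_peak(t, y, i):
    """Refine the time of a sampled peak at index i by fitting a parabola
    through (t[i-1], y[i-1]), (t[i], y[i]), (t[i+1], y[i+1])."""
    denom = y[i - 1] - 2.0 * y[i] + y[i + 1]
    delta = 0.5 * (y[i - 1] - y[i + 1]) / denom   # sub-sample offset in [-0.5, 0.5]
    return t[i] + delta * (t[i] - t[i - 1])

# Pulse-shaped signal sampled at a hypothetical low rate (20 Hz)
fs = 20
t = np.arange(0, 1, 1 / fs)
true_peak = 0.4137                     # true peak time, between two samples
y = np.exp(-((t - true_peak) ** 2) / (2 * 0.05 ** 2))

i = int(np.argmax(y))                  # nearest-sample peak estimate (0.40 s)
refined = parabolic_peak(t, y, i)      # sub-sample estimate
```

Because it uses only three samples and a closed-form expression, the method needs far less computation than resampling the whole signal with a cubic spline, which is why it suits low-power wearables.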
- Journal of applied physiology (Bethesda, Md. : 1985)
Heart rate variability (HRV) analysis is widely used to investigate the autonomic regulation of the cardiovascular system. HRV is often analyzed using RR time series, which can be affected by different types of artifacts. Although several artifact correction methods exist, no study has compared their performance in actual experimental contexts. This work aimed to evaluate the impact of different artifact correction methods on several HRV parameters. Initially, 36 ECG recordings of control rats or rats with heart failure or hypertension were analyzed to characterize artifact occurrence rates and distributions, so that they could be mimicked in simulations. After a rigorous analysis, only sixteen recordings (N = 16) with artifact-free segments of at least 10,000 beats were selected. RR interval losses were then simulated in the artifact-free (reference) time series according to the real observations. The correction methods applied to the simulated series were deletion (DEL), linear interpolation (LI), cubic spline interpolation (CI), modified moving average window (mMAW) and nonlinear predictive interpolation (NPI). Linear (time- and frequency-domain) and nonlinear HRV parameters were calculated from the corrupted-corrected time series, as well as from the reference series, to evaluate the accuracy of each correction method. The results show that NPI provides the best overall performance. However, several correction approaches, for example the simple deletion procedure, can perform well in some situations, depending on the HRV parameters under consideration.
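The simulate-corrupt-correct protocol can be sketched for one of the compared methods, cubic spline interpolation (CI). The RR series below is synthetic (a slow oscillation plus beat-to-beat noise) and the 5% loss rate is an assumption, not the rates characterized from the rat recordings.

```python
import numpy as np
from scipy.interpolate import CubicSpline

rng = np.random.default_rng(2)
# Synthetic RR interval series (ms): slow oscillation + beat-to-beat noise
n = 500
rr_ref = 800 + 50 * np.sin(2 * np.pi * np.arange(n) / 60) + rng.normal(0, 10, n)

# Simulate artifacts: 5% of beats lost at random positions
lost = rng.choice(n, size=n // 20, replace=False)
keep = np.setdiff1d(np.arange(n), lost)

# CI correction: cubic spline over beat index through the surviving beats
corrected = CubicSpline(keep, rr_ref[keep])(np.arange(n))

# Accuracy at the lost positions, against the artifact-free reference
rmse = np.sqrt(np.mean((corrected[lost] - rr_ref[lost]) ** 2))
```

In the actual comparison, the same corrupted series would also be handled by DEL, LI, mMAW and NPI, and each method's HRV parameters compared with those of the reference series.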
Anonymity, which is increasingly important to multi-receiver schemes, has recently been considered by many researchers. To protect receiver anonymity, the first multi-receiver scheme based on the Lagrange interpolating polynomial was proposed in 2010. To ensure sender anonymity, the concept of the ring signature was proposed in 2005; however, this scheme was later shown to have weaknesses, and a completely anonymous multi-receiver signcryption scheme was subsequently proposed. In that completely anonymous scheme, sender anonymity is achieved by improving the ring signature, and receiver anonymity is achieved by again using the Lagrange interpolating polynomial. Unfortunately, the Lagrange interpolation method was shown to fail to protect the anonymity of receivers, because each authorized receiver can judge whether anyone else is authorized. Therefore, the completely anonymous multi-receiver signcryption scheme mentioned above can only protect sender anonymity. In this paper, we propose a new completely anonymous multi-receiver signcryption scheme with a new polynomial technique that replaces the Lagrange interpolating polynomial; it mixes the identity information of the receivers into a ciphertext element and prevents authorized receivers from identifying the others. Along with receiver anonymity, the proposed scheme also provides sender anonymity, as well as decryption fairness and public verification.
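The mathematical tool at issue, Lagrange interpolation over a prime field, is easy to demonstrate in Shamir-secret-sharing style: the unique degree-2 polynomial through three points is evaluated at 0 to recover a hidden constant term. This is only the building block, not the signcryption scheme; the prime, polynomial, and secret below are toy values.

```python
# Lagrange interpolation over GF(P), the primitive the paper replaces.
P = 2 ** 61 - 1  # a Mersenne prime

def lagrange_eval(points, x):
    """Evaluate the unique polynomial through `points` at x, modulo P."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * ((x - xj) % P) % P
                den = den * ((xi - xj) % P) % P
        # pow(den, P-2, P) is the modular inverse of den (Fermat's little theorem)
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

# Degree-2 polynomial f(x) = s + 3x + 7x^2 mod P hides the secret s = f(0)
s = 424242
f = lambda x: (s + 3 * x + 7 * x * x) % P
points = [(1, f(1)), (2, f(2)), (3, f(3))]
recovered = lagrange_eval(points, 0)
```

The anonymity weakness stems from this very determinism: given the published interpolation data, any authorized receiver can evaluate the polynomial at another candidate identity and test whether that identity is authorized.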
The popular, stable, robust and computationally inexpensive cubic spline interpolation algorithm is adopted for finite temperature Green’s function calculations of realistic systems. We demonstrate that, with appropriate modifications, the temperature dependence can be preserved while the Green’s function grid size is reduced by about two orders of magnitude, by replacing the standard Matsubara frequency grid with a sparser grid and a set of interpolation coefficients. We benchmarked the accuracy of our algorithm as a function of a single parameter sensitive to the shape of the Green’s function. Through numerous examples, we confirmed that our algorithm can be used in a systematically improvable, controlled, and black-box manner, and that highly accurate one- and two-body energies and one-particle density matrices can be obtained using only around 5% of the original grid points. Additionally, we established that to improve the accuracy by an order of magnitude, the number of grid points only needs to be doubled, whereas for the Matsubara frequency grid an order of magnitude more grid points would be required. This suggests that realistic calculations with large basis sets that were previously out of reach because of the enormous grid sizes they required may now become feasible.
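The sparse-grid idea can be illustrated on a toy Green's function: sample the imaginary part of a single-pole G(iω) on roughly 5% of a fermionic Matsubara grid and spline back to the full grid. The inverse temperature, pole position, and the geometric choice of subgrid are assumptions for the demonstration, not the paper's actual grid construction.

```python
import numpy as np
from scipy.interpolate import CubicSpline

beta, eps = 100.0, 0.5
n_dense = np.arange(0, 4096)
w_dense = (2 * n_dense + 1) * np.pi / beta      # fermionic Matsubara frequencies
g_dense = -w_dense / (w_dense ** 2 + eps ** 2)  # Im G(iw) for a single pole at eps

# Sparse subgrid: ~4% of points, denser at low frequency where G varies fastest
idx = np.unique(np.geomspace(1, len(n_dense), 200).astype(int) - 1)
spline = CubicSpline(w_dense[idx], g_dense[idx])

# Reconstruct the full grid from the sparse samples and measure the error
g_interp = spline(w_dense)
max_err = float(np.max(np.abs(g_interp - g_dense)))
```

The geometric spacing mirrors the physics: G varies rapidly at low frequency and decays smoothly as 1/ω in the tail, so few points are needed at high frequency.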
This paper presents a novel, practical, and effective routine for reconstructing missing samples in a time-domain sequence of wirelessly transmitted IMU data during high-level mobility activities. Our work extends previous approaches involving empirical mode decomposition (EMD)-based and auto-regressive (AR) model-based interpolation algorithms in two respects: 1) we utilized a modified sifting process to decompose a signal with missing samples into a set of intrinsic mode functions, and 2) we expanded previous AR modeling for the recovery of audio signals to exploit the quasi-periodic characteristics of lower-limb movement during the modified Edgren side step test. To verify the improvements provided by the proposed extensions, a comparison with traditional interpolation methods, such as cubic spline interpolation, AR model-based interpolation, and EMD-based interpolation, was also made via simulation with real inertial signals recorded during high-speed movement. The evaluation was based on two performance criteria: the Euclidean distance and the Pearson correlation coefficient between the original signal and the reconstructed signal. The experimental results show that the proposed method improves upon the traditional interpolation methods for recovering missing samples.
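The AR-model-based half of the approach can be sketched as: fit AR coefficients by least squares on the samples preceding a dropout, then run the model forward across the gap. This is a generic sketch, not the paper's method; the quasi-periodic two-tone signal, the model order of 30, and the 20-sample gap are all assumptions.

```python
import numpy as np

def ar_fit(x, p):
    """Least-squares fit of AR(p): x[t] ~ sum_k a[k] * x[t-1-k]."""
    rows = np.array([x[t - p:t][::-1] for t in range(p, len(x))])
    return np.linalg.lstsq(rows, x[p:], rcond=None)[0]

def ar_extrapolate(history, a, n):
    """Predict n future samples from `history` using AR coefficients `a`."""
    buf = list(history)
    for _ in range(n):
        # dot the coefficients with the most recent len(a) samples (newest first)
        buf.append(float(np.dot(a, buf[-1:-len(a) - 1:-1])))
    return np.array(buf[len(history):])

# Quasi-periodic "IMU-like" signal with a burst of missing samples
t = np.arange(0, 6, 0.01)
x = np.sin(2 * np.pi * 1.5 * t) + 0.3 * np.sin(2 * np.pi * 4.5 * t)
gap = slice(300, 320)                  # 20 consecutive lost samples

a = ar_fit(x[:gap.start], 30)
filled = ar_extrapolate(x[:gap.start], a, gap.stop - gap.start)
rmse = np.sqrt(np.mean((filled - x[gap]) ** 2))
```

Forward prediction works well here precisely because the movement signal is quasi-periodic: the AR poles lock onto the dominant oscillation frequencies, which a cubic spline bridging the gap would simply smooth over.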