Browsing by title
Full Text
Peer Reviewed
Combined use of quantitative and molecular data in genetic evaluations for milk production of dual-purpose Belgian Blue Cows
Gengler, Nicolas ULiege; Mayeres, P.; Baudouin, J. et al

in Interbull Bulletin (2004), 32

Full Text
Peer Reviewed
Combined uses of supervised classification and Normalized Difference Vegetation Index techniques to monitor land degradation in the Saloum saline estuary system
Dieng, Ndeye Maguette ULiege; Dinis, Joel; Faye, Serigne et al

in Diop, Salif; Barusseau, Jean-Paul; Descamp, Cyr (Eds.) The Land/Ocean Interactions in the Coastal Zone of West and Central Africa, Estuaries of the World (2014)


Saltwater contamination constitutes a serious problem in the Saloum estuary, due to the intermittent and reverse tide flows of the Saloum River. This phenomenon is caused by the runoff deficit, which forces the advance of saltwater 60 km upstream, contaminating surface water and thus causing the degradation of biodiversity and of large areas of agricultural soils in this region. The present study aims to evaluate the consequences of saltwater contamination over the last three decades in this estuary by assessing the land-cover dynamics. The latter consists of tracking the landscape-changing process over time to identify land-cover transitions. These transitions are closely related to the ecosystem-setting condition and can be used to assess the combined impacts of both natural and human-induced phenomena over a given period of time. In this study, special attention was given to mangrove degradation and to the temporal progression of the salty barren soils locally called ‘‘tan’’. The loss of mangrove areas to tan and the general increase in salty barren soil areas can reflect the increase in the level of salinization in the study area over the time period under consideration. To fulfill this objective, four Landsat satellite images from the same season in the years 1984, 1992, 1999, and 2010 were used to infer time-series land-use and land-cover maps of the Saloum estuary area. In addition to satellite imagery, rainfall records were used to evaluate climatic variation in terms of high-to-low precipitation during the time span considered. Spectral analysis indicated that from 1984 to 2010, mangroves and savanna/rain-fed agriculture were converted to ‘‘tan’’ (denuded and salty soils). In addition, these results showed that significant changes in land use/land cover occurred across the whole estuary system, reflecting environmental degradation such as land desertification, salinization, and vegetation degradation, which in turn reflects the advance of salinity.
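The land-cover mapping described in this abstract rests on the Normalized Difference Vegetation Index, computed per pixel from the red and near-infrared Landsat bands. A minimal sketch of that computation (the reflectance values are made up for illustration, not data from the study):

```python
import numpy as np

# Per-pixel NDVI from red and near-infrared reflectances (Landsat TM bands
# 3 and 4). Hypothetical pixels: dense mangrove, sparse savanna, and a
# denuded salty "tan" surface.
red = np.array([0.10, 0.30, 0.45])
nir = np.array([0.50, 0.35, 0.20])

ndvi = (nir - red) / (nir + red)   # in [-1, 1]; vegetation -> high values
print(ndvi.round(3))
```

Tracking how such per-pixel values shift toward zero or negative over the four acquisition dates is one simple way to quantify the vegetation-to-tan transitions the authors describe.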

Full Text
Peer Reviewed
Combined utilization of DGTs and bioindicators to trace chemical contamination threats on coastal ecosystems
Richir, Jonathan ULiege; Luy, Nicolas; Serpe, Pierre et al

Conference (2011, May 03)


Trace metal monitoring in marine organisms and their habitats makes it possible to trace chronic or acute contamination of marine ecosystems due to human activities. While dissolved trace metal concentrations give an overall, punctual view of the contamination status of the biota, bioindicator species highlight the bioavailable and potentially toxic fraction of these metals. However, difficulties inherent to metal measurements in seawater have led field ecotoxicologists to study marine pollution essentially through the use of bioindicators alone. The technique of diffusive gradients in thin films (DGT) for the measurement of trace metals in aqueous solutions was introduced in the mid-1990s by Davison and Zhang. This passive probe accumulates labile trace metal species in proportion to their bulk environmental concentrations by maintaining a negative gradient between the environment and an ion-exchange resin (Chelex). DGTs average natural-water trace metal concentrations over the deployment period, concentrate them, and avoid matrix interferences, notably those due to dissolved salts in seawater. Deploying them in passive and experimental monitoring studies makes it possible to reliably measure labile trace metal concentrations and, when analysed jointly with bioindicators, to estimate their bioavailability to marine organisms. This combined DGT-bioindicator approach was investigated in Calvi Bay (Corsica) through three monitoring studies. (1) DGTs were deployed in a Posidonia oceanica bed, a Mediterranean seagrass forming dense meadows from the surface down to 40 m depth, to study seasonal, spatial and bathymetric variations of labile trace metal concentrations within this meadow. These concentrations were analysed jointly with Posidonia trace metal contents in order to quantify their bioaccumulation in this primary producer, taking into account the seagrass biological cycle. (2) Portions of the Posidonia meadow were also experimentally contaminated in situ with a mix of dissolved metals to study the seagrass kinetics of pollutant accumulation and decontamination. Thanks to DGTs deployed inside the contaminated mesocosms throughout the experiments, Posidonia responses to known metal concentrations could be precisely quantified. (3) The blue mussel Mytilus galloprovincialis is widely used in trace metal monitoring programs. Mussels, stored in conchylicultural pouches, were transplanted for 3 months to contrasting stations of Calvi Bay (e.g. aquaculture farm, sewer) in parallel with DGTs. As for Posidonia, the complementary use of DGTs and mussels made it possible to describe water contamination levels at the scale of the bay, and metal bioaccumulation in mussels. These three studies demonstrate the usefulness of DGTs for monitoring labile trace metals in an ecological and ecosystemic approach, in parallel with marine organisms, the two kinds of indicators furnishing different and complementary information about ecosystem functioning.
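The time-averaged labile concentration recovered from a DGT probe follows the standard equation C = M·Δg/(D·A·t), where M is the mass accumulated on the resin, Δg the diffusive-layer thickness, D the diffusion coefficient, A the exposure area, and t the deployment time. A back-of-the-envelope sketch with hypothetical numbers (not values from these studies):

```python
# Illustrative DGT calculation (textbook equation; all numbers hypothetical).
M = 50e-9           # metal mass accumulated on the Chelex resin (g)
delta_g = 0.094     # diffusive gel + filter thickness (cm)
D = 6.09e-6         # diffusion coefficient of the metal in the gel (cm^2/s)
A = 3.14            # exposure window area (cm^2)
t = 14 * 24 * 3600  # deployment time: 14 days, in seconds

# Time-averaged labile concentration in the bulk solution (g/cm^3)
C = M * delta_g / (D * A * t)
C_ug_per_L = C * 1e9    # convert g/cm^3 to µg/L
print(round(C_ug_per_L, 3))
```

The averaging over the whole deployment window is precisely what lets DGTs be compared with bioindicators, which also integrate exposure over time.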

Full Text
Peer Reviewed
Combiner les mesures métaboliques cérébrales et neuropsychologiques permet une meilleure prédiction de la conversion vers une maladie d’Alzheimer chez les patients MCI
Bastin, Christine ULiege; Adam, Stéphane ULiege; LEKEU, Françoise ULiege et al

in Revue Neurologique (2009), 165


Introduction. An important line of neurological research concerns the ability to predict, from the initial assessment of patients with Mild Cognitive Impairment (MCI), which of them will develop Alzheimer's disease (AD). Among neuropsychological tests, cued recall with congruent cueing at encoding and retrieval (RI48) appears to be the best predictor of the outcome of MCI patients (Ivanoiu et al., 2005). On the other hand, cerebral metabolic measures (FDG-PET), in particular hypometabolism of the temporoparietal cortex, have been shown to predict global cognitive decline in MCI better than neuropsychological measures (Chételat et al., 2005). The aim of our study was to evaluate the predictive power, for conversion from MCI to AD, of two robust predictors (RI48 performance and cerebral metabolism) taken either separately or together. Method. 50 MCI patients underwent a resting FDG-PET scan and completed the RI48 cued-recall test and the MMSE. After a 36-month neuropsychological follow-up, 28 patients had progressed to AD and 22 remained stable. Cerebral metabolism and cognitive performance were compared between "converters" and stable MCI patients. Discriminant analyses were then used to evaluate the classification ability of age, the MMSE, and the metabolic and memory measures, considered individually or in various combinations. Results. Compared with stable MCI patients, converters showed hypometabolism in the middle temporal cortex bilaterally, the right inferior parietal cortex and the right precuneus, as well as lower initial RI48 performance. Taken individually, the different measures yielded the same correct-classification rate (cerebral metabolism = 76%, RI48 = 76%). Age and the MMSE were weak predictors (classification accuracy = 62% and 66%, respectively). In contrast, the combination of metabolic measures and RI48 scores best predicted progression to AD (88%). Conclusion. The results suggest that the optimal strategy for identifying which MCI patients are most at risk of developing AD is to combine cerebral metabolic measures with performance on a highly sensitive memory test.
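The study's key finding, that two complementary predictors classify converters better than either alone, can be illustrated with a toy nearest-centroid classifier on synthetic data (a simplified stand-in for the discriminant analyses; all numbers are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
# Synthetic "stable MCI" vs "converters": two weakly informative predictors
# (think: a metabolic index and a memory score), each shifted between groups.
stable = rng.normal([0.0, 0.0], 1.0, size=(n, 2))
conv = rng.normal([1.2, 1.2], 1.0, size=(n, 2))
X = np.vstack([stable, conv])
y = np.array([0] * n + [1] * n)

def nearest_centroid_acc(features):
    # Classify each subject by the nearer class centroid; return accuracy.
    Z = X[:, features]
    c0, c1 = Z[y == 0].mean(axis=0), Z[y == 1].mean(axis=0)
    pred = np.linalg.norm(Z - c1, axis=1) < np.linalg.norm(Z - c0, axis=1)
    return (pred == y).mean()

acc_single = nearest_centroid_acc([0])      # one predictor alone
acc_combined = nearest_centroid_acc([0, 1])  # both predictors together
print(acc_single, acc_combined)
```

Because the two predictors carry partly independent information, the combined classifier separates the groups better than either one alone, mirroring the 76% vs 88% pattern reported in the abstract.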

Full Text
Combiner sport et études. Comment cela est-il vécu ?
Cloes, Marc ULiege

Scientific conference (2004, October 16)

Full Text
Peer Reviewed
Combining a stability and a performance-oriented control in power systems
Glavic, M.; Ernst, Damien ULiege; Wehenkel, Louis ULiege

in IEEE Transactions on Power Systems (2005), 20(1), 525-526


This paper suggests that the appropriate combination of a stability-oriented and a performance-oriented control technique is a promising way to implement advanced control schemes in power systems. The particular approach considered combines control Lyapunov functions (CLF) and reinforcement learning. The capabilities of the resulting controller are illustrated on a control problem involving a thyristor-controlled series capacitor (TCSC) device for damping oscillations in a four-machine power system.

Full Text
Peer Reviewed
Combining acceleration techniques for pricing in a VRP with time windows
Michelini, Stefano ULiege; Arda, Yasemin ULiege; Küçükaydin, Hande

Conference (2016, January 28)


In this study, we investigate a solution methodology for a variant of the VRP with time windows. The cost of each route depends on its overall duration (including waiting times), while the departure time of a vehicle is a decision variable. Furthermore, each route has a maximum permitted duration. In order to solve this problem with a branch-and-price methodology, we also study the associated pricing problem, an elementary shortest path problem with resource constraints (ESPPRC). Compared to the classical ESPPRC, this variant admits an infinite number of Pareto-optimal states. To tackle this, it was shown in [1] that the total travelling time can be represented as a piecewise linear function of the service start time at the depot. Together with this representation, an appropriate label structure and dominance rules are proposed and integrated into an exact bidirectional dynamic programming algorithm [2]. Certain acceleration techniques can be implemented in the dynamic programming algorithm used to solve the pricing problem. We focus on two of them: decremental state space relaxation (DSSR), introduced in [3], and ng-route relaxation, introduced in [4] and [5]. DSSR aims to gradually enforce the elementarity constraints on the path, which adversely affect the number of generated and dominated labels. A set of critical nodes is iteratively populated, and elementarity is enforced only on these critical nodes. When using ng-route relaxation, a neighbourhood is defined for each vertex. The labels are then extended such that, thanks to this neighbourhood structure, the only cycles allowed are those that are relatively expensive and therefore less likely to appear in the optimal solution. In this study, we explore several strategies for applying these techniques, for example initialization strategies for the critical vertex set in DSSR, or the size of the neighbourhoods for ng-route relaxation. We also analyze two ways of combining DSSR and ng-route relaxation. The different algorithmic choices are represented as categorical parameters, which, together with the numerical ones, can be tuned with tools for automatic algorithm configuration such as the irace package [6]. We discuss how this column generation procedure can be included as a component in the development of a matheuristic based on the idea in [7], which consists in a collaboration scheme between a branch-and-price algorithm, an exact MIP solver, and a metaheuristic.
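The DSSR idea described in this abstract, solving a relaxed pricing problem and iteratively adding nodes that appear in cycles to the critical set, can be illustrated on a toy instance (hypothetical arc costs; a brute-force stand-in for the actual labeling algorithm):

```python
# Toy pricing-problem instance: cheapest path from node 0 to node 4 using at
# most 4 arcs (the "resource"). Negative arc costs (reduced costs) make a
# non-elementary path attractive. Hypothetical data, for illustration only.
arcs = {(0, 1): 1, (1, 2): -5, (2, 1): -5, (1, 4): 1, (2, 4): 1, (0, 2): 2}
MAX_ARCS = 4

def best_path(critical):
    """Cheapest 0->4 path; elementarity enforced only on `critical` nodes."""
    best = None
    frontier = [((0,), 0)]
    for _ in range(MAX_ARCS):
        extended = []
        for path, cost in frontier:
            for (u, v), c in arcs.items():
                if u != path[-1]:
                    continue
                if v in critical and v in path:
                    continue  # DSSR: forbid revisits of critical nodes only
                p, pc = path + (v,), cost + c
                if v == 4 and (best is None or pc < best[1]):
                    best = (p, pc)
                extended.append((p, pc))
        frontier = extended
    return best

# DSSR loop: start fully relaxed, then add every repeated node of the
# current best path to the critical set until that path is elementary.
critical = set()
while True:
    path, cost = best_path(critical)
    repeats = {v for v in path if path.count(v) > 1}
    if not repeats:
        break
    critical |= repeats
print(path, cost)
```

On this instance the relaxation first returns the cyclic path 0-1-2-1-4; two DSSR iterations grow the critical set to {1, 2} and the final elementary optimum is 0-1-2-4. The point of the technique is that, on realistic instances, elementarity rarely needs to be enforced on more than a few nodes.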

Full Text
Peer Reviewed
Combining Active Learning and Reactive Control for Robot Grasping
Kroemer, Oliver; Detry, Renaud ULiege; Piater, Justus ULiege et al

in Robotics and Autonomous Systems (2010)

Full Text
Peer Reviewed
Combining an original method for preserving RNA expression in situ with an effective RNA method makes it possible to study gene expression in any banana fruit tissue
Lassois, Ludivine ULiege; de Lapeyre de Bellaire, Luc; Jijakli, Haissam ULiege

in Fruits (2009), 64(3), 127-137


Introduction. RNA isolation is a prerequisite to studying gene expression in banana and to understanding changes occurring in response to the environment. Standard extraction methods do not efficiently extract RNA from plants such as banana, with high levels of phenolics, carbohydrates, or other compounds that bind to and/or coprecipitate with RNA. Materials and methods. Five to seven RNA extraction methods were compared. Four crown-tissue storage methods were also compared. cDNA-AFLP was used to ensure that the obtained RNA was of sufficient quality for molecular applications and that RNA expression was unaltered by in situ storage. Results and discussion. The modified hot-borate method proved to be the best RNA extraction method, allowing high yields of good-quality, undegraded RNA from the crown, fruit peel and pulp at all stages of ripening. The RNA obtained by this method was of sufficient quality for molecular applications such as cDNA-AFLP that give highly reproducible results. Freeze-drying of fresh tissues and tissue conservation in hot-borate buffer, two original storage methods, appear appropriate for preserving RNA in situ without ultra-low temperatures. The RNA obtained was of high quality, undegraded, and useful for all downstream applications. The genome expression profile obtained by cDNA-AFLP analysis was unaltered by these methods for storing collected tissues. Conclusion. By applying all the procedures suggested in this work, it is possible to store and study gene expression in any banana fruit tissue, whatever the maturity stage, without affecting the RNA expression level.

Full Text
A Combining Approach to Cover Song Identification
Osmalsky, Julien ULiege

Doctoral thesis (2017)


This thesis is concerned with the problem of determining whether two songs are different versions of each other. This problem is known as the problem of cover song identification, which is a challenging task, as different versions of the same song can differ in terms of pitch, tempo, voicing, instrumentation, structure, etc. Our approach differs from existing methods by considering as much information as possible to identify cover songs. More precisely, we consider audio features spanning multiple musical facets, such as the tempo, the duration, the harmonic progression, the musical structure, and the relative evolution of timbre, among others. In order to do that, we evaluate several state-of-the-art systems on a common database containing 12,856 songs, a subset of the Second Hand Song dataset. In addition to evaluating existing systems, we introduce our own methods, based on global features and making use of supervised machine learning algorithms to build a similarity model. For evaluating and comparing the performance of 10 cover song identification systems, we propose a new intuitive evaluation space, based on the notions of pruning and loss, which represents the performance of the selected systems in a two-dimensional space. We further demonstrate that it is compatible with standard metrics, such as the mean rank, the mean reciprocal rank and the mean average precision. Using our evaluation space, we present a comparative analysis of the 10 systems. The results show that few systems are usable in a commercial setting, as the most efficient is able to identify a match at the first position for 39% of the analyzed queries, which corresponds to 4,965 songs. In addition, we evaluate the systems when they are pushed to their limits, by analyzing how they perform when the audio signal is strongly degraded. To improve the identification rate, we investigate ways of combining the 10 systems. We evaluate rank aggregation methods, which aggregate ordered lists of similarity results to produce a new, better ordering of the database. We demonstrate that such methods produce improved results, especially for early pruning applications. In addition to evaluating rank aggregation techniques, we propose to study combination through probabilistic rules. As the 10 selected systems do not all produce probabilities of similarity, we investigate calibration techniques to map scores to relevant posterior probability estimates. After the calibration process, we evaluate several probabilistic rules, such as the product, the sum, and the median rules. We further demonstrate that a subset of the 10 initial systems produces better performance than the full set, thus showing that some systems are not relevant to the final combination. Applying a probabilistic product rule to a subset of systems significantly outperforms any individual system on the considered database. In terms of direct identification (top-1), we achieve an improvement of 10% (5,460 tracks identified), and in terms of mean rank, mean reciprocal rank and mean average precision, we respectively improve the performance by 40%, 9.5%, and 12.5%, with respect to the previous state-of-the-art performance. We further implement our final combination in a practical application, named DISCover, giving the user the possibility to select a query and listen to the produced list of results. While a cover is not systematically identified, the produced list of songs is often musically similar to the query.
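The probabilistic product rule mentioned in this abstract can be sketched as follows, assuming each system already outputs calibrated posterior probabilities (the numbers below are hypothetical, not the thesis's actual scores):

```python
import math

# Hypothetical calibrated posteriors P(candidate is a cover of the query)
# produced by three systems, for four candidate tracks.
probs = {
    "A": [0.80, 0.60, 0.55],
    "B": [0.40, 0.70, 0.50],
    "C": [0.30, 0.20, 0.25],
    "D": [0.75, 0.10, 0.65],
}

def product_rule(ps):
    # Geometric-mean variant of the product rule: multiply the posteriors,
    # then take the n-th root so the score stays on a probability-like scale.
    return math.prod(ps) ** (1 / len(ps))

ranking = sorted(probs, key=lambda t: product_rule(probs[t]), reverse=True)
print(ranking)
```

Note how candidate D, despite one strongly dissenting system, is pulled below B: the product rule heavily penalizes candidates that any single system considers unlikely, which is why calibration of the individual scores matters so much.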

Full Text
Peer Reviewed
Combining classification techniques with Kalman filters for aircraft engine diagnostics
Dewallef, Pierre ULiege; Romessis, C.; Léonard, Olivier ULiege et al

in Journal of Engineering for Gas Turbines & Power (2006), 128(2), 281-287


A diagnostic method consisting of a combination of Kalman filters and a Bayesian belief network (BBN) is presented. A soft-constrained Kalman filter uses a priori information derived by the BBN at each time step to derive estimations of the unknown health parameters. The resulting algorithm has improved identification capability in comparison to the stand-alone Kalman filter. The paper focuses on a way of combining the information produced by the BBN with the Kalman filter. An extensive set of fault cases is used to test the method on a typical civil turbofan layout. The effectiveness of the method is thus demonstrated, and its advantages over the individual constituent methods are presented.
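The soft-constraint idea, injecting the BBN's a priori information into the filter, can be caricatured in one dimension by treating the BBN belief as an extra pseudo-measurement (a deliberately minimal sketch with hypothetical numbers, not the paper's multivariate formulation):

```python
# One-dimensional caricature of a soft-constrained Kalman filter: the BBN's
# belief about a health parameter enters as a pseudo-measurement with its
# own variance. All numbers are hypothetical.
def kalman_update(x, P, z, R):
    """Scalar measurement update for a directly observed state."""
    K = P / (P + R)                  # Kalman gain
    return x + K * (z - x), (1 - K) * P

x, P = 1.0, 0.5                      # prior estimate of the health parameter
z_sensor, R_sensor = 0.90, 0.04      # noisy engine measurement
z_bbn, R_bbn = 0.85, 0.09            # BBN belief, used as a soft constraint

x, P = kalman_update(x, P, z_sensor, R_sensor)
x, P = kalman_update(x, P, z_bbn, R_bbn)
print(round(x, 3), round(P, 4))      # estimate pulled toward the BBN belief
```

Because the constraint is soft (it has a finite variance R rather than being imposed exactly), the filter can still deviate from the BBN's belief when the sensor evidence is strong.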

Full Text
Peer Reviewed
Combining classification techniques with Kalman filters for aircraft engine diagnostics
Dewallef, Pierre ULiege; Léonard, Olivier ULiege; Mathioudakis, Kostas et al

in Proceedings of the ASME Turbo Expo 2004 (2004, June)

Full Text
Peer Reviewed
Combining Color, Depth, and Motion for Video Segmentation
Leens, Jérôme ULiege; Pierard, Sébastien ULiege; Barnich, Olivier et al

in Computer Vision Systems (2009)


This paper presents an innovative method to interpret the content of a video scene using a depth camera. Cameras that provide distance instead of color information are part of a promising young technology, but they come with many difficulties: noisy signals, low resolution, and ambiguities, to cite a few. By taking advantage of the robustness to noise of a recent background subtraction algorithm, our method is able to extract useful information from the depth signals. We further enhance the robustness of the algorithm by combining this information with that of an RGB camera. In our experiments, we demonstrate this increased robustness and conclude by showing a practical example of an immersive application taking advantage of our algorithm.
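Combining the depth-based and color-based information can be sketched at its simplest as a per-pixel fusion of two binary foreground masks (toy masks; the paper's actual method is built on a full background subtraction algorithm, not shown here):

```python
import numpy as np

# Toy foreground masks from background subtraction applied independently to
# the RGB stream and the depth stream (hypothetical data).
fg_rgb = np.array([[0, 1, 1, 0],
                   [0, 1, 0, 0],
                   [0, 0, 0, 1]], dtype=bool)    # bottom-right: color noise
fg_depth = np.array([[0, 1, 1, 0],
                     [0, 1, 1, 0],
                     [0, 0, 0, 0]], dtype=bool)  # depth misses one pixel

# Two simple fusion rules: intersection suppresses single-sensor noise,
# union recovers pixels missed by one modality.
fused_and = fg_rgb & fg_depth
fused_or = fg_rgb | fg_depth
print(fused_and.sum(), fused_or.sum())
```

The trade-off between the two rules (fewer false positives vs. fewer missed pixels) is exactly the kind of complementarity that motivates fusing the two sensor streams.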

Full Text
Peer Reviewed
Combining Coronagraphy with Interferometry as a Tool for Measuring Stellar Diameters
Riaud, P.; Hanot, Charles ULiege

in Astrophysical Journal (2010)


The classical approach for determining stellar angular diameters is to use interferometry and to measure fringe visibilities. Indeed, in the case of a source having a diameter larger than typically λ/6B, B being the interferometer's baseline and λ the wavelength of observation, the fringe contrast decreases. Similarly, it is possible to perform angular diameter determinations by measuring the stellar leakage from a coronagraphic device or a nulling interferometer. However, all coronagraphic devices (including those using nulling interferometry) are very sensitive to pointing errors and to the size of the source, two factors with significant impact on the rejection efficiency. In this work, we present an innovative idea for measuring stellar diameter variations, combining coronagraphy with interferometry. We demonstrate that, using coronagraphic nulling statistics, it is possible to measure such variations for angular diameters down to ≈λ/40B with 1σ error bars as low as ≈λ/1500B. For that purpose, we use a coronagraphic implementation on a two-aperture interferometer, a configuration that significantly increases the precision of stellar diameter measurements. Such a design offers large possibilities regarding the stellar diameter measurement of Cepheids or Mira stars, at a 60-80 μas level. We report on a simulation of a measurement applied to a typical Cepheid case, using the VLTI-UT interferometer on Paranal.

Full Text
Combining ecotope segmentation and remote sensing data for biotope and species distribution modelling
Coos, William ULiege; Delangre, Jessica ULiege; Radoux, Julien et al

Poster (2016, April 29)


The design of appropriate biodiversity conservation actions requires extensive knowledge of biotope and species distributions. Biodiversity monitoring is often a time-consuming task; however, it can be optimised by biotope and species distribution models. In the Lifewatch project, a database combining a segmentation into homogeneous landscape units (“ecotopes”) with environmental attributes derived from regularly updated remote sensing data (land cover, topography, potential solar energy,…) and other data sources (climate and edaphic factors) has been designed. Our aim was to assess the usefulness of this database for biotope and species distribution modelling. As a case study, the distributions of peatbogs (actual and potential) and of a peatbog-specialist butterfly (the cranberry fritillary Boloria aquilonaris (Stichel, 1908)) were independently modelled using the Random Forest algorithm. The agreement between the biotope and species distribution models was assessed, and our map of predictions was compared to a model derived from a more classical grid-based approach. We observed that the ecotope segmentation fitted objective field boundaries more closely, thereby improving the efficiency of biodiversity monitoring. The comparison between actual and potential biotopes allowed us to identify potential restoration areas.
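As a rough illustration of distribution modelling by bagged voting in the spirit of Random Forest, here is a miniature stand-in using bootstrap-trained decision stumps combined by majority vote (toy environmental data, not the Lifewatch database, and not the full Random Forest algorithm):

```python
import random

random.seed(1)

# Toy presence(1)/absence(0) records for a peatbog-specialist species as a
# function of two hypothetical environmental attributes per ecotope:
# (wetness index, elevation in m).
data = [((0.9, 600), 1), ((0.8, 650), 1), ((0.85, 700), 1), ((0.7, 620), 1),
        ((0.2, 300), 0), ((0.3, 250), 0), ((0.1, 400), 0), ((0.25, 350), 0)]

def train_stump(sample):
    # Pick the axis-aligned threshold minimising training error ("x >= t -> 1").
    best = None
    for f in (0, 1):
        for x, _ in sample:
            t = x[f]
            err = sum((x2[f] >= t) != bool(y2) for x2, y2 in sample)
            if best is None or err < best[0]:
                best = (err, f, t)
    _, f, t = best
    return lambda x: int(x[f] >= t)

# "Forest": stumps trained on bootstrap resamples, combined by majority vote.
stumps = [train_stump(random.choices(data, k=len(data))) for _ in range(25)]

def predict(x):
    votes = sum(s(x) for s in stumps)
    return int(votes * 2 > len(stumps))

print(predict((0.9, 710)), predict((0.05, 200)))
```

Applied over every ecotope in a segmentation, such per-unit predictions yield a distribution map whose boundaries follow landscape units rather than arbitrary grid cells, which is the advantage the poster highlights.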

Full Text
Peer Reviewed
Combining feature extraction methods to assist the diagnosis of Alzheimer's disease
Segovia, Fermin; Górriz, J. M.; Ramírez, J. et al

in Current Alzheimer Research (2016), 13


Neuroimaging data such as 18F-FDG PET are widely used to assist the diagnosis of Alzheimer’s disease (AD). Looking for regions with hypoperfusion/hypometabolism, clinicians may predict or corroborate the diagnosis of the patients. Modern computer-aided diagnosis (CAD) systems based on the statistical analysis of whole neuroimages are more accurate than classical systems based on quantifying the uptake of some predefined regions of interest (ROIs). In addition, these new systems make it possible to determine new ROIs and take advantage of the huge amount of information comprised in neuroimaging data. A major branch of modern CAD systems for AD is based on multivariate techniques, which analyse a neuroimage as a whole, considering not only the voxel intensities but also the relations among them. In order to deal with the vast dimensionality of the data, a number of feature extraction methods have been successfully applied. In this work, we propose a CAD system based on the combination of several feature extraction techniques. First, some commonly used feature extraction methods based on the analysis of variance (such as principal component analysis), on the factorization of the data (such as non-negative matrix factorization) and on classical magnitudes (such as Haralick features) were simultaneously applied to the original data. These feature sets were then combined by means of two different combination approaches: i) using a single classifier and a multiple kernel learning approach, and ii) using an ensemble of classifiers and selecting the final decision by majority voting. The proposed approach was evaluated using a labelled neuroimaging database along with a cross-validation scheme. In conclusion, the proposed CAD system performed better than approaches using only one feature extraction technique. We also provide a fair comparison (using the same database).
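The first combination approach, multiple kernel learning, rests on the fact that a weighted sum of valid kernels is itself a valid kernel. A minimal sketch with random stand-in features (the feature names in the comments are only an analogy to the paper's PCA/NMF/Haralick sets, and the weights here are fixed rather than learned):

```python
import numpy as np

# Three feature representations of the same 4 scans (stand-ins for, e.g.,
# PCA scores, NMF loadings, Haralick textures), each giving a linear kernel.
rng = np.random.default_rng(42)
feats = [rng.normal(size=(4, d)) for d in (5, 3, 8)]
kernels = [F @ F.T for F in feats]

# Simplest multiple-kernel combination: a convex combination of base kernels.
beta = np.array([0.5, 0.3, 0.2])
K = sum(b * Kk for b, Kk in zip(beta, kernels))

# A valid kernel matrix is symmetric positive semi-definite.
print(np.allclose(K, K.T), np.linalg.eigvalsh(K).min() >= -1e-10)
```

In a full MKL system the weights beta would be optimised jointly with the classifier, so that the most informative feature set dominates the combined kernel.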

Full Text
Peer Reviewed
Combining Features for Cover Song Identification
Osmalsky, Julien ULiege; Embrechts, Jean-Jacques ULiege; Foster, Peter et al

in 16th International Society for Music Information Retrieval Conference (2015, October)


In this paper, we evaluate a set of methods for combining features for cover song identification. We first create multiple classifiers based on global tempo, duration, loudness, beats and chroma average features, training a random forest for each feature. Subsequently, we evaluate standard combination rules for merging these single classifiers into a composite classifier based on global features. We further obtain two higher-level classifiers based on chroma features: one based on comparing histograms of quantized chroma features, and a second one based on computing cross-correlations between sequences of chroma features, to account for temporal information. For combining the latter chroma-based classifiers with the composite classifier based on global features, we use standard rank aggregation methods adapted from the information retrieval literature. We evaluate performance with the Second Hand Song dataset, where we quantify performance using multiple statistics. We observe that each combination rule outperforms single methods in terms of the total number of identified queries. Experiments with rank aggregation methods show an increase of up to 23.5% in the number of identified queries, compared to single classifiers.
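A standard rank aggregation method of the kind adapted from the information retrieval literature is the Borda count, in which each classifier's ranking awards points by position. A toy sketch (hypothetical rankings, not the paper's classifier outputs):

```python
# Borda-count rank aggregation over three classifiers' candidate rankings.
rankings = [
    ["t1", "t2", "t3", "t4"],   # e.g. chroma-histogram classifier
    ["t2", "t1", "t4", "t3"],   # e.g. cross-correlation classifier
    ["t1", "t4", "t2", "t3"],   # e.g. composite global-feature classifier
]

n = len(rankings[0])
scores = {}
for ranking in rankings:
    for pos, track in enumerate(ranking):
        scores[track] = scores.get(track, 0) + (n - pos)  # rank 1 -> n points

aggregated = sorted(scores, key=scores.get, reverse=True)
print(aggregated)
```

Because every position contributes, a candidate ranked consistently well by all classifiers (t1 here) beats one ranked first by a single classifier, which is the behaviour that makes aggregation outperform the individual rankings.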
