    • Mechatronic design of a fast non-contact measurement system for inspection of casting parts in a production line

      Ahuett-Garza, Horacio; Kurfess, Thomas R.; Guamán-Lozada, Darío F.; Urbina Coronado, Pedro Daniel; Orta Castañon, Pedro (Instituto Tecnológico y de Estudios Superiores de Monterrey, 2019-05-11)
      Product recalls represent high financial losses and reputation damage for suppliers (Tier 1-2-3) and OEMs. This has motivated manufacturers to inspect 100% of the specifications of 100% of the parts produced to avoid liability risks. In general, manufactured parts are measured on CMM machines; the main problem is that measurement takes a long time, so CMMs cannot be installed in a continuous-line process. This problem has led industries to install gauging machines to gain full control over their production. Gauging machines are not flexible: a number of sensors equal to the number of targets to be inspected is needed, complicating maintenance and increasing cost. Finally, most gauges are of the go/no-go type, which only validates whether the characteristics comply with a standard. In addition, with the arrival of the Industry 4.0 concept, companies have seen the need to develop fast, reliable, and accurate inspection machines capable of sending proper information about themselves or the product to the cloud. This work presents a new measurement system for an in-line die-casting process. Its main characteristic is the use of a linear motor and non-contact measurement technology for fast and reliable measurements. The machine also uses a novel kinematic coupling configuration to allow easy, fast, and accurate positioning of the part in the measurement area. To be compatible with Industry 4.0, the inspection machine is equipped with sensors that send process information to the cloud, such as operating temperature, vibrations, and dynamic machine behavior.
    • Wavelets for spindle health diagnosis

      Morales Menéndez, Rubén; Villagómes Garzón, Silvia Cristina; Vallejo Guevara, Antonio; Hernández Alcántara, Diana (2018-12-04)
      Industrial development and customer demands have increased the need for high-quality products at low cost while ensuring safety during manufacturing. As a result, rotating machinery and its components have become increasingly complex, making repairs more expensive. Many efforts must therefore be focused on preventing machine breakdowns, for which real-time fault diagnosis and prognosis are mandatory. Considering that the element most prone to failure in a machining center is the spindle, and with it its bearing system, diagnosing faults in these elements is of paramount importance. To ensure safe bearing operation, fault detection methods based on different techniques have been developed; one of the most commonly used is vibration analysis. Analyzing vibration signals poses several difficulties: they are complex, non-stationary signals with a large amount of noise. Conventional analyses have not been able to solve this problem, so alternative methods such as the Wavelet Transform have been gaining ground. The following research focuses on detecting bearing faults, as well as main shaft faults that eventually also lead to bearing damage, by using wavelets. Signals presenting distinct bearing fault conditions, from different data sets, are evaluated to validate the proposed methodology, and an exhaustive analysis was developed to select its best parameters. As a result, an improvement of around 20% in the magnitude of bearing fault frequency peaks was found compared to the traditional methodology. Giving more weight to high-energy components increases these fault frequency peaks while reducing low-frequency noise, a great advantage in the pursuit of automatic fault detection. An industrial scenario was also validated, proving that the proposed methodology is more immune to noise. Even though the magnitudes of the bearing fault peaks are diminished by noise, a comparison between the proposal and the traditional methodology reveals an increase of approximately 70% in those magnitudes, demonstrating that the fault information is barely attenuated by noise. An early diagnosis capability was also demonstrated, which could benefit future studies of fault prognosis. Finally, the filtering property of wavelet decomposition is exploited to limit the signal's frequencies to a few harmonics of the shaft speed, restricting the spectrum so that other faults, mainly affecting the spindle shaft, can be detected by analyzing speed harmonics and subharmonics. Thus, a complete methodology is proposed to deal with the main spindle faults.
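The energy-weighting idea described in this abstract can be illustrated with a short sketch. This is not the thesis' exact algorithm; the wavelet family, decomposition depth, and weighting scheme below are illustrative assumptions: decompose the vibration signal with a discrete wavelet transform, scale each band by its relative energy, and reconstruct, so that high-energy fault bands are emphasized and low-energy noise bands are attenuated.

```python
# Illustrative energy-weighted wavelet reconstruction (assumed parameters:
# db4 wavelet, 4 levels; not the thesis' exact methodology).
import numpy as np
import pywt

def energy_weighted_reconstruction(signal, wavelet="db4", level=4):
    """Weight each wavelet band by its relative energy and reconstruct:
    high-energy (fault-carrying) bands are emphasized, low-energy noise
    bands are attenuated."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    weights = energies / energies.sum()            # relative energy per band
    return pywt.waverec([w * c for w, c in zip(weights, coeffs)], wavelet)

# Example: a 100 Hz "fault" tone buried in broadband noise
fs = 10_000
t = np.arange(0, 1, 1 / fs)
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 100 * t) + 0.5 * rng.standard_normal(t.size)
y = energy_weighted_reconstruction(x)[: t.size]
peak_hz = np.fft.rfftfreq(t.size, 1 / fs)[np.argmax(np.abs(np.fft.rfft(y)))]
```

With these parameters the tone's band keeps most of the signal energy, so the dominant spectral peak of the reconstruction remains at the 100 Hz component while the noise bands are attenuated.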
    • Implementation of a comparative table of operating parameters for the electric generators of the C.T. Manzanillo units to support decision-making aimed at reducing economic losses from reactive power generation

      Martell Chávez, Fernando; Meléndez García, Sergio Javier; Nuñez Borbón, Jacobo; Jiménez Arellano, Rodrigo; Rosales Martín, Alfonso; Martell Chávez, Fernando (Instituto Tecnológico y de Estudios Superiores de Monterrey, 2018-11-12)
      The C.T. Gral. Manuel Álvarez Moreno has two combined-cycle modules. This technology uses natural gas as fuel, employing the Brayton cycle to convert the fuel's chemical energy into thermal energy, produce steam, and finally deliver electricity at the output of the electric generators. The gas and steam turbines of the combined cycle are coupled to electric generators that produce electricity through the interaction of electromagnetic fields. The generators are connected to the National Interconnected System (SIN) through an electrical substation. The generators can correct the power factor seen by the SIN: when the system behaves inductively, a generator compensates by raising its voltage and supplying capacitive reactance to the system, and vice versa. On occasion, control-center personnel request an increase in the reactive power of one generator without considering that another generator at the same plant is consuming reactive power, which harms the plant economically, because that energy produces no revenue while increasing operating costs. The project implements a reading log to identify the moments when positive and negative reactive power are generated simultaneously, so the situation can be corrected and one generator prevented from supplying reactive power that another generator at the same plant is consuming. The objective is to reduce energy losses in the generation process caused by differential reactive-power settings among the generating units of the C.T. Gral. Manuel Álvarez Moreno (Manzanillo), avoiding the cost of producing no less than 50.00 MVAR-h per week. The project requires an initial investment of $304,925.08 MXN and will yield monthly savings of $49,800.00 MXN, so the investment will be recovered in about seven months, which makes the project fully viable for implementation and follow-up. We believe the comparative table will also bring an important implicit benefit by setting a precedent: the personnel in charge of operating the generating units will be aware of the company's current context as a Productive Subsidiary Enterprise under the Energy Reform and will direct their efforts toward minimizing losses of energy and supplies within their scope of responsibility. If accepted at this plant, the project can be implemented at other generating plants, increasing the benefits of its application.
    • Technical-economic feasibility study for the replacement of commercial drying ovens with solar drying ovens

      Martell Chávez, Fernando; Salazar-Calderón, Alejandro; Favela-Coghlan, Fernando; Vargas-Martínez, Mario (Instituto Tecnológico y de Estudios Superiores de Monterrey, 2018-11-12)
      Technical-economic analysis for the replacement of electric dehydration ovens with a dehydration oven based on a flat solar concentrator with vacuum tubes, for drying chiles of various species in the city of Tula de Allende, Hidalgo.
    • Formulation of a power-uprate project for the Boquilla hydroelectric plant located in the municipality of San Francisco de Conchos, Chihuahua

      Torres Puente, Eduardo Francisco; Martell Chávez, Fernando; Bejarano-Raygoza, Ramiro; Calderón-Martínez, Alfredo; García-Burrola, Jesús Manuel; Melendez García, Sergio Javier (Instituto Tecnológico y de Estudios Superiores de Monterrey, 2018-11-12)
      This project aims to increase the electricity generation capacity of the Boquilla hydroelectric plant by at least 10% through the modernization of the currently installed equipment, with an execution period of three years once the project is authorized by the CFE board of directors, in order to maximize the use of non-turbined water. The energy-efficiency measures to be applied and the measurement and verification plan are described.
    • Implementation of coatings on die-casting cavities

      Gümes Castorena, David; López Miranda, Adán; Martínez Jaime, Ricardo; López Miranda, Adán; López Soriano, Eduardo (Instituto Tecnológico y de Estudios Superiores de Monterrey, 2018-09-17)
      The automotive parts manufacturing industry is in constant motion and always seeks to minimize its operating costs because of the intense competition that exists globally. Denso has an aluminum die-casting division where a large share of operating costs goes to the maintenance and manufacture of mold tooling. The purpose of this project is to reduce costs in the mold area. During the manufacture of aluminum parts it is normal for inserts and cavities to suffer damage such as erosion and fracture. When a cavity is damaged it must undergo a repair process that consists of welding new material onto the damaged zone and then re-machining the desired shape. The drawback of this repair is that the part no longer has its original characteristics, so the probability that it will fracture again increases, causing unwanted production stoppages and repair costs. In recent years, new technologies have been introduced to help prevent wear and fracture of mold cavities and inserts. The objective is to protect the base metal from any damage. To achieve this, a protective layer is applied to the cavity surface, which wears down until the steel is exposed again. The strategy is to re-coat the cavities whenever a low coating level is detected in any zone, thereby avoiding wear of the base metal. The purpose of this study is to present sufficient evidence that the selected strategy is feasible and can deliver the expected results. This technology is expected to reduce mold-tooling-related costs by at least forty percent.
    • Implementation of a Quality Management System based on the ISO 9001:2015 standard

      López Miranda, Adán; Félix Trujillo, Agustín; García Bayardo, Emmanuel (Instituto Tecnológico y de Estudios Superiores de Monterrey, 2018-09-08)
      This text presents a summary of a graduation project for the Master's in Engineering Management at Tecnológico de Monterrey. The project consists of the implementation of a Quality Management System based on the ISO 9001:2015 standard, using project management tools and concepts to achieve an effective execution.
    • Design of a quality management model focused on the third sector

      Olivares Olivares, Silvia Lizett; López Mendoza, Diana Anai; Cantú Delgado, José Humberto; García Justicia, Javier José (Instituto Tecnológico y de Estudios Superiores de Monterrey, 2018-05-29)
      Without a doubt, the Third Sector, better known as the "social sector," has had an important impact in Mexico in recent years, owing to the growing number of institutions devoted to different causes and the growing number of people working in the sector. However, managing quality and establishing mechanisms to evaluate transparency, effectiveness, and efficiency in the sector, as well as the quality of life of beneficiaries, has become one of the most difficult goals to achieve. Therefore, this research aimed to assess the quality level of the Third Sector in Nuevo León through a Quality Management Model tailored to these organizations, for which eleven relevant aspects were established. To this end, a study was carried out in 14 organizations in Nuevo León, both Civil Associations (A.C.) and Private Charitable Associations (A.B.P.), applying a quantitative method with a non-experimental, cross-sectional descriptive design during June 2017. The results indicate that the Third Sector in Nuevo León currently sits at the third level of quality, with an overall score of 3.09, meaning the sector is still maturing and establishing the common characteristics that would ensure its development, permanence, and solid growth.
    • Immunoaffinity aqueous two-phase systems to establish novel bioprocesses for the primary recovery of CD133+ stem cells

      Rito Palomares, Marco Antonio; Zavala Arcos, Judith; González González, Mirna Alejandra; Ornelas González, Alonso; Rito Palomares, Marco Antonio; Zavala Arcos, Judith; González González, Mirna Alejandra (Instituto Tecnológico y de Estudios Superiores de Monterrey, 2018-05-25)
      A stem cell isolation bioprocess with short processing times and efficient scale-up is essential to exploit the potential of these cells for the treatment of multiple chronic diseases. Various methodologies have been used for stem cell recovery; however, most of them have economic and/or time-consuming drawbacks. In this work, immunoaffinity aqueous two-phase systems, a liquid-liquid separation technology enhanced by PEGylation of the antibody, were characterized and optimized with the aim of increasing specificity in the recovery of CD133+ stem cells from human umbilical cord blood samples. The methodology consisted of evaluating the partitioning of differently PEGylated antibodies (amine, carboxyl, thiol, succinimidyl ester, methoxy PEG, and maleimide) in three previously studied aqueous two-phase systems (ATPS): PEG-dextran (DEX), Ucon-DEX, and Ficoll-DEX. Subsequently, an optimization step was carried out to steer the partition behavior of the CD133/2-pure antibody toward the desired phase in the selected systems by varying (increasing and decreasing) two parameters closely related to the partitioning of molecules in ATPS: tie-line length (TLL) and volume ratio (VR). Afterwards, the partitioning behavior of the six PEGylated antibodies in the optimized systems was tested. According to the results, PEGylation of the CD133/2-biotin antibody induced a favorable change with respect to the non-PEGylated antibody when the Ucon-DEX system was used, fractionating it to both phases. Likewise, optimization of the systems proved effective in changing the partition preference of the antibody. The best results were obtained when Ucon-DEX or PEG-DEX systems with TLL 15% w/w or 20% w/w were combined with VR 3. Finally, PEGylated antibodies were added to the selected optimized systems. Even though a shift in the fractionation preference of the PEGylated CD133/2-biotin antibody was achieved in the optimized systems, the partition was not adequate enough to justify evaluating this immunoaffinity ATPS with human umbilical cord samples. Both PEGylation and optimization proved effective in changing the partition preference of the antibody; however, further studies are required to find the optimal system composition that will fractionate 100% of the antibody to the phase opposite the contaminants, making the system an ideal candidate to be tested for the selective recovery of CD133+ stem cells.
    • Determination of interaction properties between PEGylated proteins and a modified resin by Isothermal Titration Calorimetry (ITC) and FTIR

      Aguilar-Jiménez, Oscar Alejandro; Magaña, Paulyna; Gonzalez-Valdez, José Guillermo; Ramos de la Peña, Ana Mayela (Instituto Tecnológico y de Estudios Superiores de Monterrey, 2018-05-25)
      PEGylated proteins are an increasingly important class of therapeutic drugs due to their improved pharmacokinetics and solubility relative to their native forms. PEGylation is the covalent attachment of one or more polyethylene glycol (PEG) molecules to a protein. Despite the many advantages of PEGylated drugs, one of the major challenges is the purification step after the chemical reaction. The main purpose of this project is to determine the nature of the chemical interactions between a resin modified with PEG 5000 g/mol and PEGylated proteins, which underlie the previously demonstrated ability of such resins to resolve PEGylated proteins. Chromatographic separation of PEGylated lysozyme was additionally demonstrated using the modified resin Sepharose 6B-PEG5000 previously reported for PEGylated RNase A. Fourier Transform Infrared (FTIR) spectroscopy provided insight into the resin modification. The interaction thermodynamics of PEGylated proteins in hydrophobic interaction chromatography (HIC) with the modified resin were examined by Isothermal Titration Calorimetry (ITC) analysis. The binding enthalpy (∆H) was found to be exothermic for both proteins in potassium phosphate buffer with 1.5 M ammonium sulfate at 25 °C. MonoPEGylated proteins showed large negative entropic terms (-T∆S), related to the enhanced hydrophobic interaction between PEG5000 molecules on the resin and the PEGylated protein forms. In addition, the binding constants (K) of PEGylated proteins to the modified resin were slightly higher than those of the unmodified proteins.
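The thermodynamic quantities reported here (K, enthalpy, entropic term) are linked by standard relations: the binding constant gives the free energy via ΔG = -RT ln K, and subtracting the calorimetrically measured enthalpy yields the entropic term. The sketch below illustrates this bookkeeping; the K and ΔH values are illustrative, not results from this work.

```python
# Standard ITC thermodynamic relations: ΔG = -RT·ln K and ΔG = ΔH - TΔS,
# so the entropic term is -TΔS = ΔG - ΔH. K and ΔH below are illustrative.
import math

R = 8.314        # gas constant, J/(mol·K)
T = 298.15       # 25 °C in kelvin

def itc_thermodynamics(K, dH):
    """Return (ΔG, -TΔS) in J/mol from a binding constant K (M⁻¹)
    and a measured binding enthalpy ΔH (J/mol)."""
    dG = -R * T * math.log(K)
    return dG, dG - dH

dG, minus_TdS = itc_thermodynamics(K=1.0e5, dH=-20_000.0)
# K = 1e5 M⁻¹ gives ΔG ≈ -28.5 kJ/mol; with ΔH = -20 kJ/mol, -TΔS ≈ -8.5 kJ/mol
```

A larger K (tighter binding to the PEGylated resin) makes ΔG more negative; with an exothermic ΔH of fixed size, the remainder shows up in the entropic term, which is the signature of hydrophobic PEG-PEG interaction discussed above.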
    • Genetic transformation of Artemia franciscana by electroporation

      Aguilar-Yañez, Jose Manuel; Licona-Cassani, Cuautemoc; Rodríguez-Sánchez, Alberto Constantino; Brunck, Marion E.G. (Instituto Tecnológico y de Estudios Superiores de Monterrey, 2018-05-25)
      Artemia franciscana, also known as brine shrimp or sea monkey, is a halophilic crustacean used in aquaculture as live feed; it has played only a minor role in research as a toxicological model. Despite its easy handling and availability, Artemia has barely been genetically engineered. The objective of this work is to develop a protocol for the genetic transformation of Artemia franciscana using electroporation and diapause cysts. Decapsulated cysts were electroporated with exponential-decay and square-wave pulses, and the transformation efficiency was calculated. The effects of the plasmid DNA used and of the applied voltage were evaluated. The square-wave pulse showed better efficiency than the exponential one. The concentration of DNA had no effect on transfection efficiency. A field strength of 1000 V·cm-1 gave the best transfection efficiency but also the worst hatching rate. With this protocol and the information generated, it will be possible to evaluate genetic regulatory elements for the future development of a toolbox for genetic manipulation.
    • Data-Driven approach to topology change location in distribution networks using microPMUs

      Mayo Maldonado, Jonathan Carlos; Salas Esquivel, Ernesto Adán; Valdez Resendiz, Jesús Elías; Micheloud Vernackt, Osvaldo Miguel (Instituto Tecnológico y de Estudios Superiores de Monterrey, 2018-05-24)
      Motivated by the aim of increasing renewable energy penetration into the grid, the Mexican government established the objective of producing half of its energy from clean sources by 2050. This is also the trend in the rest of the world, but utilities are not yet prepared to deal with the challenges this change will bring. One way to address these issues is to evolve from the antiquated power system model to a smart grid by building a control and communications infrastructure and by introducing sensing and metering technologies. In this sense, micro-phasor measurement units (μPMUs) are devices tailored for this purpose, but the technology requires specialized research in order to develop tools for field applications. Driven by this need, we set the objective of building an application based on μPMU technology. In this thesis we therefore propose an algorithm for topology change location in distribution networks using μPMU data, based on behavioral systems theory, in which we use any set of variables available for measurement within a network. This approach differs from classical methods in that it requires no information about the network model and assumes no particular type of disturbance when locating its occurrence within the network. MATLAB simulations and experiments using μPMUs and a dSPACE data acquisition card produced satisfactory results: the algorithm proved capable of locating single topology changes in distribution networks.
    • Pipeline evaluation of clustering algorithms aimed at clinical data

      Temez Peña, José G.; Duarte Dyck, David Absalón; Terashima Marín, Hugo; Treviño Alvarado, Víctor M. (Instituto Tecnológico y de Estudios Superiores de Monterrey, 2018-05-22)
      Disease understanding is key to designing effective treatments and diagnostic tools. A key aspect of this understanding is grouping patients according to their phenotypes: patterns in the characteristics of certain members of a population that are correlated with a particular illness. This grouping may reveal associations between disease risk, treatment responses, and other key clinical outcomes. Once these associations are found, it is easier to design tailored diagnostic tools and effective personalized treatments. Data is key to achieving this grouping goal, and recent advances in digital technology have made it possible to capture hundreds and thousands of clinical variables that may be used to group patients into different disease phenotypes. To handle hundreds of patients with hundreds of features, clinical researchers use clustering algorithms that automatically find hidden associations between subjects. These algorithms are very useful once the researcher selects the correct clustering method and configures it for the specific research task. Selecting the correct clustering algorithm is time-consuming, and setting its parameters may take several trial-and-error sessions. On the other hand, computer scientists have developed several clustering metrics that can evaluate the fitness of clustering algorithms to the data, and computing power has increased, allowing automated testing and evaluation of clustering algorithms on a specific data set. The objective of this proposal was the development of an automated computational pipeline that evaluates several clustering algorithms, providing metrics for important properties such as clustering stability (Jaccard index) and clustering relevance (ANOVA test). Furthermore, the pipeline returns the number of natural clusters that may be useful for the given dataset (Dunn index). The pipeline was set up to evaluate the classical clustering algorithms k-means, fuzzy c-means, and hierarchical clustering, but it can also test a user-provided clustering method. The evaluation consists of bootstrapping the data and extracting the Dunn and Jaccard clustering indexes in a meaningful manner. Furthermore, the clinical relevance of the final clusters is evaluated using an ANOVA test, which provides indications of disease phenotypes. All test results are plotted, so the user can visually evaluate the performance of the different clustering methods on their data. The resulting tool was deployed in R (github.com/majordave/clustest). The utility of the pipeline was tested on synthetic data sets and on two radiomics datasets associated with the development of osteoarthritis (OA) and with the presence of breast cancer in mammograms. Furthermore, we contrasted the clustering approach with supervised learning on a large dataset relating nutrition to OA symptoms. Hence, the present work established that automated, robust evaluation of the utility of clustering algorithms on clinical data is feasible, and it provides a publicly available software tool that clinical researchers can use to select the best clustering algorithm for their data.
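The bootstrap-stability step described above can be sketched as follows. The pipeline itself is the R tool cited in the abstract; this is a minimal Python analogue, and the cluster count, number of bootstrap resamples, and k-means settings are illustrative assumptions.

```python
# Bootstrap Jaccard stability of k-means clusters (minimal sketch of the
# idea; the actual pipeline is the R tool cited in the abstract).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

def bootstrap_jaccard(X, k=3, n_boot=20, seed=0):
    """For each cluster of the full data, average (over bootstrap
    resamples) its best Jaccard overlap with a re-fitted clustering."""
    rng = np.random.default_rng(seed)
    base = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(X)
    scores = np.zeros(k)
    for _ in range(n_boot):
        idx = rng.integers(0, len(X), len(X))        # bootstrap resample
        boot = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(X[idx])
        for c in range(k):
            a = set(np.flatnonzero(base[idx] == c))  # cluster c, resample positions
            best = max(
                len(a & set(np.flatnonzero(boot == b)))
                / max(1, len(a | set(np.flatnonzero(boot == b))))
                for b in range(k)
            )
            scores[c] += best / n_boot
    return scores        # per-cluster mean Jaccard (values near 1 mean stable)

X, _ = make_blobs(n_samples=150, centers=3, cluster_std=0.5, random_state=0)
stability = bootstrap_jaccard(X, k=3)
# well-separated blobs yield stability close to 1 for every cluster
```

The max over re-fitted clusters handles the label-permutation problem: k-means may number the same groups differently on each resample, so each original cluster is matched to its best counterpart before the Jaccard overlap is averaged.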
    • Pulsation regimes near static borders of optically injected semiconductor lasers

      Campuzano Treviño, Gabriel; Aldaya Grande, Ivan Artiz; Rodriguez-Martinez, Sergio Luis; Castañón Avila, Gerardo Antonio; Arizpe, Israel de León (Instituto Tecnológico y de Estudios Superiores de Monterrey, 2018-05-21)
      A semiconductor laser injected externally by another laser (master-slave configuration) is a theoretical and experimental tool well suited to the scientific exploration of nonlinear effects. Using numerical tools, we have found pulsation regimes beyond period 1 and period 2. Techniques such as finding the maxima and minima of a time series, the ratio between maxima and minima, and the total harmonic distortion are very useful for finding and characterizing pulsation regimes. Pulsation regimes were found near the borders of the laser's stable synchronization region. The characterized region spans frequencies from 150 MHz to 2.78 GHz with frequency detuning from 0.98 GHz to 3 GHz.
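Total harmonic distortion, one of the characterization tools mentioned above, can be estimated directly from an FFT: it is the ratio of the root-sum-square of the harmonic amplitudes to the fundamental amplitude. The sketch below illustrates the computation on a synthetic waveform; the signal parameters are illustrative, not values from the laser study.

```python
# THD estimated from an FFT: sqrt(sum of squared harmonic amplitudes)
# divided by the fundamental amplitude. Signal parameters are illustrative.
import numpy as np

def thd(signal, fs, f0, n_harmonics=5):
    """THD of `signal` sampled at `fs` Hz, with fundamental frequency f0 (Hz)."""
    spec = np.abs(np.fft.rfft(signal)) / len(signal)
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)

    def amp(f):
        # amplitude of the FFT bin nearest frequency f
        return spec[np.argmin(np.abs(freqs - f))]

    harmonics = np.sqrt(sum(amp(n * f0) ** 2 for n in range(2, n_harmonics + 2)))
    return harmonics / amp(f0)

fs = 100_000
t = np.arange(0, 0.01, 1 / fs)   # 10 ms window → 100 Hz bin resolution
x = np.sin(2 * np.pi * 1000 * t) + 0.1 * np.sin(2 * np.pi * 2000 * t)
# a 10% second harmonic gives THD ≈ 0.1
```

A nearly sinusoidal (period-1) pulsation yields a low THD, while period-doubled or more complex pulsation regimes raise it, which is what makes the metric useful for mapping regime borders.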
    • Design and development of a reconfigurable die for thermoforming process

      Cortés Ramírez, Jorge Armando; Reyna Yáñez, Felipe Osvaldo; Cárdenas Alemán, Eduardo; Aguayo Téllez, Humberto (Instituto Tecnológico y de Estudios Superiores de Monterrey, 2018-05-17)
      According to the World Economic Forum (WEF), nascent technologies such as the Internet of Things (IoT), 3D printing, and advanced robotics account for 70%-80% of the product, manufacturing, and services market in North America and Europe. In this context, the design and implementation of reconfigurable tools can yield more flexible processes oriented toward improving productivity in the manufacturing industry, especially in molding and thermoforming packaging. A reconfigurable die based on shape-memory alloy actuators addresses these industry challenges through the incorporation of a microstructure-, temperature-, and time-based constitutive model. Among these challenges is the need to fabricate millions of pieces, currently limited by low-volume production and high tooling investment and maintenance costs. The objective of this thesis work is the development of a technology for manufacturing variable-shape packages with a reconfigurable die in a thermoforming process. The functionality and performance of the technology rest on the application of shape-memory alloy theory to a reconfigurable system; the development of a functional prototype with both NiTi-based and stepper-based motion mechanisms; and the development of a technology roadmap that gives a vision of the potential customer segment for the product and/or technology, according to their needs and opportunities.
    • Hilbert-Huang transform based methodology for bearing fault detection

      Vallejo Guevara, Antonio Jr.; Morales Menéndez, Rubén; Campos García, Rubén; Ibarra Zárate, David Isaac (Instituto Tecnológico y de Estudios Superiores de Monterrey, 2018-05-16)
      Rotating machinery is of great importance to the manufacturing industry, and huge investments are therefore made every year to acquire it. Machine preservation plays an important role in the exploitation of this resource. Rotating machines are more susceptible to certain types of faults: investigations report that at least 42% of the root causes of failure in rotating machinery are related to bearings. Many techniques have been developed to detect the bearing condition, and one of the most reliable is vibration analysis. The Hilbert-Huang transform (HHT) has been used for vibration analysis and has gained attention in recent years; a topic of controversy in this method is the selection of the Intrinsic Mode Functions (IMFs) that carry fault information. Statistical parameters can be used to describe the characteristics of vibration signals, and this attribute can be exploited to select the IMFs. Many time-domain features are used for signal analysis. In this research, a study of 17 statistical parameters was made to determine which one best represents IMFs with fault information. As a result of this analysis, a new methodology based on the HHT is proposed. This methodology selects the IMFs with fault information using KR (Kurtosis × RMS) and can be used to detect incipient bearing faults. The proposed methodology was validated with 18 signals from the Case Western Reserve University (CWRU), Tian-Yau Wu, and Society for Machinery Failure Prevention Technology (MFPT Society) databases. Of the 18 analyzed signals, only one IMF was wrongly selected. The cause of this error was the end effect produced in the EMD, which caused the KR amplitude to increase even though the IMF did not contain fault information. For 14 signals, the envelope spectrum clearly showed fault components with large amplitude. For the remaining four signals the envelope spectrum was noisy, but the fault components were still distinguishable.
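The KR criterion described above (Kurtosis × RMS) is simple enough to sketch in a few lines. The helper names `kr_index` and `select_imf`, the synthetic signals, and the use of non-excess kurtosis are illustrative assumptions, not the thesis implementation:

```python
import numpy as np

def kr_index(imf):
    """KR = kurtosis x RMS of one IMF (illustrative helper)."""
    centered = imf - imf.mean()
    m2 = np.mean(centered ** 2)
    kurt = np.mean(centered ** 4) / m2 ** 2   # non-excess kurtosis
    rms = np.sqrt(np.mean(imf ** 2))
    return kurt * rms

def select_imf(imfs):
    """Pick the IMF whose KR value is largest, i.e. the one most
    likely to carry impulsive (fault-related) content."""
    return int(np.argmax([kr_index(imf) for imf in imfs]))
```

The intuition the sketch captures: a smooth sinusoidal IMF has low kurtosis, while an IMF dominated by periodic impacts (the signature of a bearing defect) has a heavy-tailed distribution and therefore a much larger KR.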
    • Environmental impact of conventional manufacturing and additive manufacturing in lifecycle of turbine blade

      Rodríguez González, Ciro Ángel; Siller Carrillo, Héctor Rafael; Vila Pastor, Carlos; Torres Carrillo, Sharon Andrea; Vega, Yadira (Instituto Tecnológico y de Estudios Superiores de Monterrey, 2018-05-16)
      The exponential growth of additive manufacturing technologies is not only improving production processes to achieve functional requirements for products, but could also help minimize environmental impacts. To align with a green product lifecycle management vision, companies need to implement emerging technologies and define a set of metrics that measure the benefits of the change. Each product requires a particular, optimized manufacturing process plan, and each production phase must achieve a significant reduction of critical metrics for the whole Life Cycle Assessment (LCA). This study provides a comprehensive, comparative LCA of two manufacturing process plans for the case study of an aircraft engine turbine blade. The first process consists of a combination of Investment Casting and Precision Machining; the second replaces Investment Casting with Selective Laser Melting as an emergent process for near-net-shape fabrication. The collected data for the comparison include Global Warming Potential (GWP), Acidification Potential (AP), Ozone Layer Depletion Potential (ODP), Human Toxicity Potential (HTP), Ecotoxicity, and Abiotic Depletion Potential (ADP).
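The comparison mechanics behind such a study can be sketched as a per-category aggregation over process phases. The route names, phase names, and all numbers below are placeholders carrying no real LCA data; only the aggregation pattern is the point:

```python
# Hypothetical per-phase impact inventories; placeholder values only.
# Keys follow the study's categories (GWP, AP, ODP, HTP, ADP, ...).
CASTING_ROUTE = {
    "investment_casting":  {"GWP": 1.0, "AP": 1.0, "ODP": 1.0},
    "precision_machining": {"GWP": 1.0, "AP": 1.0, "ODP": 1.0},
}

def route_totals(route):
    """Sum each impact category over all phases of a process plan."""
    totals = {}
    for phase_impacts in route.values():
        for category, value in phase_impacts.items():
            totals[category] = totals.get(category, 0.0) + value
    return totals

def compare(route_a, route_b):
    """Per-category difference (a - b); negative means route a is lower."""
    ta, tb = route_totals(route_a), route_totals(route_b)
    return {c: ta[c] - tb[c] for c in ta}
```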
    • A circular lean product-service systems design framework: motivations, drivers and constraints

      Romero Díaz, David; González Chávez, Clarissa Alejandra; Rodríguez, Ciro; Vázquez, Elisa (Instituto Tecnológico y de Estudios Superiores de Monterrey, 2018-05-16)
      In recent years, the service sector has received increasing attention from both academics and practitioners. The transition from traditional manufacturing to service-integrated systems has given rise to the “servitisation revolution”, which today is a relevant revenue generator not only for companies but also for societies. The definition of “Product-Service Systems” (PSS) calls for value generation through market expansion, by adding competitive advantages to companies’ previous offerings. This approach has been recognized as one of the most efficient techniques for achieving resource-efficient and sustainable economies. PSS has grown beyond expectations, becoming a common term in publications of the most recognized academic institutions and a highly discussed topic across a broad range of geographically diverse organizations. However, recent literature suggests analysing the compatibility of PSS with other tools, methodologies, and principles, which may help enhance the intrinsic environmentally sustainable advantage that belonged to its first definitions but has, unfortunately, faded over time. This research attempts to do so: to analyse how Product-Service Systems can benefit from Circular Economy and Lean principles. In a first attempt, valuable but non-systemized literature was found. Through this research work, a Systematic Literature Review is developed to identify, through an objective quantitative and qualitative analysis, those tools, principles, and methodologies that can modify each stage of a PSS. Furthermore, a first Circular Lean Product-Service System (CLPSS) Design Framework is proposed and extensively described. This framework is validated empirically through a case study supported by two vessel-building companies. Further research is required to validate the proposed framework among different industries with higher involvement in the CLPSS design.
    • Methodology for optimizing the order-picking process in secondary distribution routes through cargo-vehicle warehouse redesign

      Smith Cornejo, Neale Ricardo; Cárdenas Barrón, Leopoldo Eduardo; Garza Arrambide, Gustavo Adolfo; Tercero Gómez, Víctor Gustavo (Instituto Tecnológico y de Estudios Superiores de Monterrey, 2018-05-15)
      Inventory management for perishable consumer products, i.e., those with a shelf life of less than 30 days, represents an enormous challenge for logistics and distribution areas. This challenge stems mainly from food-safety requirements, cold-chain preservation, control of the inventory stored inside the vehicle, and compliance with the delivery schedule, all of which are needed to maintain adequate service levels with customers. Different sales models exist in industry; this case study applies the on-board sales model, in which the salesperson handles both the sale and the delivery of the product in a single visit to each customer. For this reason, inventory management and product arrangement inside the vehicle are highly relevant topics in this research. This thesis proposes a methodology to optimize the arrangement of products inside the vehicle, allowing the salesperson to identify product locations more easily and thereby reducing both the distance travelled and the time spent preparing orders. Additionally, some variants of the standard design are evaluated to assess the impact of several variables related to the order-picking process. Different designs were tested, with favourable results in most of them; results were particularly positive for designs with baskets reduced by 11% with respect to the standard volume, where the average distance per pick decreased by 8.7% compared with the standard proposal. The variables with the greatest impact on the order-picking process are basket size, product volume, and demand.
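The metric driving the redesign, average distance per pick, can be illustrated with a minimal slotting sketch: pairing the highest-demand products with the slots nearest the pick point and computing the demand-weighted mean travel distance. The function name, the demand figures, and the slot distances are hypothetical, not the thesis data:

```python
def avg_distance_per_pick(demand, slot_distances, greedy=True):
    """Demand-weighted mean travel distance per pick.

    greedy=True pairs the highest-demand product with the nearest slot
    (the slotting idea behind the redesign); greedy=False keeps the
    demand values in their given, arbitrary order as a baseline.
    """
    picks = sorted(demand.values(), reverse=True) if greedy else list(demand.values())
    pairs = zip(picks, sorted(slot_distances))
    return sum(d * dist for d, dist in pairs) / sum(demand.values())
```

For example, with picks per route of {"A": 1, "B": 10, "C": 5} and slots at 1, 2, and 3 metres from the vehicle door, the greedy layout yields a strictly lower average distance per pick than the arbitrary baseline.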
    • Design and Implementation of a UAV-based Platform for Air Pollution Monitoring and Source Identification

      Garza Castañón, Luis Eduardo; Yungaicela Naula, Noé Marcelo; Ponce Cuspinera, Luis; Vargas Martínez, Adriana (Instituto Tecnológico y de Estudios Superiores de Monterrey, 2018-05-15)
      This document presents the thesis proposal for obtaining the Master of Science in Intelligent Systems. Technology, industry, and government forecasts coincide that the planet will withstand at most 50 more years at the current rate of air pollution. Air pollution has reached critical levels, causing major impacts on health and the economy across the globe. Environmental monitoring and control agencies, as well as industries, require a reliable, cost-effective tool that is easy to deploy wherever needed to assess contamination levels and, on that basis, take the necessary actions. Current measurement methods using pressurized balloons, satellite imagery, or ground stations require considerable investment and provide low spatial and temporal resolution. Systems for measuring air pollution with Unmanned Aerial Vehicles (UAVs) also exist, but they are financed by large government institutions or international organizations whose budgets and resources allow costly implementations. Other related works are limited to capturing atmospheric data with UAVs and analysing it offline. This work presents the design and implementation of an open-source UAV-based platform for measuring atmospheric pollutants, together with an algorithm for localizing air-pollutant sources using a UAV and in-line processing of the pollutant data. The development of the UAV-based platform includes: mounting and characterizing the UAV and the control system that guides the vehicle's navigation; selecting the appropriate sensors and integrating them into the UAV; transmitting data from the on-board sensors to the ground station; and implementing a web-based user interface.
The algorithm for air-pollutant source localization is based on a metaheuristic component, which follows the increasing gradient of the pollutant concentration, complemented with a probabilistic component that concentrates the search on the most promising areas of the target environment. The results of this work are outdoor experiments with the UAV-based platform for air-pollutant monitoring and indoor experiments that validate the source-localization algorithm. The results show the effectiveness and robustness of the UAV-based platform and of the source-identification algorithm.
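The two-component search described above (gradient-following plus probabilistic exploration) can be illustrated with a minimal grid sketch. A synthetic Gaussian plume stands in for real sensor readings, and `concentration`, `seek_source`, and every parameter value are assumptions for illustration, not the thesis algorithm:

```python
import math
import random

def concentration(pos, source=(8, 6)):
    """Stand-in for a gas-sensor reading: a Gaussian plume centred at a
    source whose position the search never reads directly."""
    dx, dy = pos[0] - source[0], pos[1] - source[1]
    return math.exp(-(dx * dx + dy * dy) / 10.0)

def seek_source(start, steps=200, eps=0.1, seed=1):
    """Grid search mixing greedy gradient-following with eps-probability
    random exploration; only moves that raise the reading are kept."""
    rng = random.Random(seed)
    pos = start
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    for _ in range(steps):
        if rng.random() < eps:   # probabilistic exploration component
            cand = (pos[0] + rng.choice([-1, 0, 1]),
                    pos[1] + rng.choice([-1, 0, 1]))
        else:                    # follow the increasing gradient
            cand = max(((pos[0] + dx, pos[1] + dy) for dx, dy in moves),
                       key=concentration)
        if concentration(cand) > concentration(pos):
            pos = cand
    return pos
```

Because only improving moves are accepted, the greedy component drives the agent monotonically toward the concentration peak, while the exploration term models how a probabilistic component can redirect the search toward promising regions without derailing it.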