"Mohyal Educational & Research Institute of Technology" acronym as MERIT was established in October '99 by GENERAL MOHYAL SABHA (Regd.) to set up an educational society with emphasis on Information Technology. Since then, the institution has been achieving newer heights by empowering the youth of both Mohyals and Non-Mohyals. MERIT has not only contributed to youth, but has also has become a catalytic factor for other age groups to learn IT.
MERIT has set up a library in the basement of the Mohyal Foundation to cater to the faculty and students. The library holds more than 600 books, mainly relating to computer science and information technology, along with periodicals, newspapers, journals, audio and video CDs, and a reading room. These materials are issued to students and faculty on requisition. Rules have been formulated for enrolling members to the library and for the issue and lending of books to members.
MERIT has four fully air-conditioned computer labs at its premises, equipped with systems of the latest configuration.
A conference hall or conference room is a room provided for singular events such as business conferences and meetings.
Fully air-conditioned classrooms with projector facilities.
"PSI Services is a global leader in workforce solutions with over 70 years experience delivering successful testing programmes to help people achieve success in their academic, personal, and work lives.
MERIT conducted a National Symposium on Big Data Analytics on 6th & 7th July last year. Big data analytics is the process of examining large data sets to uncover hidden patterns, unknown correlations, market trends, customer preferences and other useful business information. The primary goal of big data analytics is to help companies make more informed business decisions by enabling data scientists, predictive modelers and other analytics professionals to analyze large volumes of transaction data, as well as other forms of data that may be untapped by conventional business intelligence (BI) programs. That could include web server logs and Internet clickstream data, social media content and social network activity reports, text from customer emails and survey responses, mobile phone call detail records, and machine data captured by sensors connected to the Internet of Things.

Semi-structured and unstructured data may not fit well in traditional data warehouses based on relational databases. Furthermore, data warehouses may not be able to handle the processing demands posed by sets of big data that need to be updated frequently or even continually, for example real-time data on the performance of mobile applications or of oil and gas pipelines. As a result, many organizations looking to collect, process and analyze big data have turned to a newer class of technologies that includes Hadoop and related tools such as YARN, MapReduce, Spark, Hive and Pig, as well as NoSQL databases. Those technologies form the core of an open source software framework that supports the processing of large and diverse data sets across clustered systems.
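The MapReduce processing model mentioned above can be illustrated with a small Python sketch: a map step that emits (key, value) pairs from raw records, followed by a reduce step that combines them per key. The web server log format used here is a made-up assumption for illustration; a real Hadoop or Spark job would apply the same two steps across a cluster rather than a list in memory.

```python
from functools import reduce

# Hypothetical web server log lines: "timestamp url status"
logs = [
    "2019-07-06T10:00:01 /home 200",
    "2019-07-06T10:00:02 /about 200",
    "2019-07-06T10:00:03 /home 404",
    "2019-07-06T10:00:04 /home 200",
]

# Map step: emit a (url, 1) pair for every log record
mapped = [(line.split()[1], 1) for line in logs]

# Reduce step: sum the counts per url, as MapReduce does after shuffling
def combine(acc, pair):
    url, count = pair
    acc[url] = acc.get(url, 0) + count
    return acc

hits = reduce(combine, mapped, {})
print(hits)  # {'/home': 3, '/about': 1}
```

The same pattern scales because both steps are independent per record and per key, which is what lets a clustered framework split the work across many machines.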
In some cases, Hadoop clusters and NoSQL systems are being used as landing pads and staging areas for data before it gets loaded into a data warehouse for analysis, often in a summarized form that is more conducive to relational structures. Increasingly, though, big data vendors are pushing the concept of a Hadoop data lake that serves as the central repository for an organization's incoming streams of raw data. In such architectures, subsets of the data can be filtered for analysis in data warehouses and analytical databases, or analyzed directly in Hadoop using batch query tools, stream processing software and SQL-on-Hadoop technologies that run interactive, ad hoc queries written in SQL.

Big data can be analyzed with the software tools commonly used as part of advanced analytics disciplines, such as predictive analytics, data mining, text analytics and statistical analysis. Mainstream BI software and data visualization tools can also play a role in the analysis process.

Potential pitfalls that can trip up organizations on big data analytics initiatives include a lack of internal analytics skills and the high cost of hiring experienced analytics professionals. The amount of information that is typically involved, and its variety, can also cause data management headaches, including data quality and consistency issues. In addition, integrating Hadoop systems and data warehouses can be a challenge, although various vendors now offer software connectors between Hadoop and relational databases, as well as other data integration tools with big data capabilities.

The availability of big data, low-cost commodity hardware, and new information management and analytic software has produced a unique moment in the history of data analysis. The convergence of these trends means that we have the capabilities required to analyze astonishing data sets quickly and cost-effectively for the first time in history. These capabilities are neither theoretical nor trivial.
They represent a genuine leap forward and a clear opportunity to realize enormous gains in terms of efficiency, productivity, revenue, and profitability.
The Age of Big Data is here, and these are truly revolutionary times if both business and technology professionals continue to work together and deliver on the promise.
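The interactive, ad hoc SQL querying described earlier can be sketched with Python's built-in sqlite3 module standing in for a SQL-on-Hadoop engine such as Hive; the clickstream table and its rows are illustrative assumptions, but the GROUP BY aggregation is exactly the kind of ad hoc question an analyst would pose to such a system.

```python
import sqlite3

# An in-memory database stands in for an analytical store
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clicks (user TEXT, page TEXT, dwell_secs REAL)")
conn.executemany(
    "INSERT INTO clicks VALUES (?, ?, ?)",
    [("a", "/home", 12.0), ("a", "/buy", 30.0),
     ("b", "/home", 5.0), ("b", "/buy", 45.0)],
)

# Ad hoc interactive query: average dwell time per page
rows = conn.execute(
    "SELECT page, AVG(dwell_secs) FROM clicks GROUP BY page ORDER BY page"
).fetchall()
print(rows)  # [('/buy', 37.5), ('/home', 8.5)]
```

On a real cluster the same SQL text would be compiled into distributed batch or interactive jobs, which is what makes SQL-on-Hadoop tools approachable for analysts who already know relational querying.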