Computer Technology

A. Definition
B. Computer Science and Engineering
C. Computer Technology Definitions
D. Additional Definitions

A. Definition

Computer Science is the scientific and practical approach to computation and its applications, bringing together disciplines including mathematics, engineering, the natural sciences, psychology, and linguistics. Computer Science involves the systematic study of methodical processes (e.g., algorithms and protocols) that aid the acquisition, representation, processing, storage, communication of, and access to information, by analysing the feasibility, structure, expression, and mechanisation of work processes.

Computer Engineering is a discipline that integrates electronic engineering with computer science to design and develop computer systems and other technological devices, including computer hardware and software. Computer Engineering draws on the theories and principles of computing, mathematics, science, and engineering, and applies them to solve technical problems through the design of computing hardware, software, networks, and processes.

Telecommunication is the technology of exchanging information over significant distances by electronic and electrical means, including the transmission of signs, signals, messages, words, writings, images, sounds, or information of any nature by wire, radio, optical, or other electromagnetic systems.

Wireless Network is a computer or power-supply network without cable connections, in which data or power is transferred between network nodes at different equipment locations. Wireless networks are typically based on radio waves and are used to connect devices such as laptops and mobile phones to the Internet (LAN, Wi-Fi), mobile phone networks, wireless local area networks (WLANs), wireless sensor networks, terrestrial microwave networks, and so on.

B. Computer Science and Engineering

Application Program (or Application Software, App) is a comprehensive, self-contained program written and designed for a specific need or purpose (e.g., word processor, spreadsheet, web browser, email, media player, file viewer). (Opposite of System Software, which is the infrastructure of the computer, e.g., the operating system, utilities, and related components.)

Artificial General Intelligence (AGI) is a branch of theoretical Artificial Intelligence (AI) research working to develop AI with a human level of cognitive function, including the ability to self-teach; it is a concept with both aspirational and practical consequences. AGI has been defined as an autonomous system that surpasses human capabilities in the majority of economically valuable tasks. Researchers and experts in the AI community are actively working to advance AI technologies, but AGI has not yet been realised, and creating truly autonomous, general-purpose AI remains a complex and challenging task.

Artificial Intelligence (AI) is a branch of computer science aimed at enabling computers to copy, emulate, or simulate human intelligence. AI is the ability of a digital computer or computer-controlled robot to perform tasks commonly associated with human intelligence. AI processes include the acquisition of information, its use or application, and the approximation of conclusions, covering topics including problem solving, reasoning, planning, natural language, programming, and machine learning. (Refer to the AI (Artificial Intelligence) Capabilities)

AI (Artificial Intelligence) Capabilities: It is useful to think of AI capabilities in four main categories: 1) Perception involves collecting and interpreting information to sense the world and describe it. These capabilities include natural language processing, computer vision, and audio processing; 2) Prediction involves using reasoning to anticipate behaviours and results. Such technologies are used, for example, to develop precisely targeted advertising for specific customers; 3) Prescription is principally concerned with what to do to achieve goals. It has a variety of use cases, including route planning, drug discovery, and dynamic pricing; 4) Last but not least, AI can be combined with complementary technologies such as robotics to provide integrated solutions. These include autonomous driving, robotic surgery, and household robots that respond to stimuli. (Source: www.mckinsey.com/)

Autocorrection is the automatic correction of internal data errors performed by the computer without human intervention.

Autogeneration is 1) the generation of electricity by an industrial facility to meet the needs of its own operations (e.g., Combined Heat and Power); 2) in computer science, Autogeneration (Automatic Programming), a type of computer programming in which a mechanism generates a computer program, allowing programmers to write code at a higher abstraction level (e.g., AI (Artificial Intelligence) and autogeneration of code); 3) the automatic creation of something internally, without any other involvement.
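
The second sense above can be illustrated with a toy sketch (the template, function names, and field names are all hypothetical, invented for this example): a generator writes Python source text for accessor functions and compiles it at runtime, so the programmer specifies only field names rather than writing each function by hand.

```python
# Toy code generator: given field names, write Python source for accessor
# functions and compile that generated source at runtime.
TEMPLATE = "def get_{name}(record):\n    return record[{name!r}]\n"

def generate_accessors(fields):
    namespace = {}
    for name in fields:
        source = TEMPLATE.format(name=name)  # produce source text
        exec(source, namespace)              # compile the generated source
    return namespace

fns = generate_accessors(["id", "title"])
print(fns["get_title"]({"id": 7, "title": "Report"}))  # Report
```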

Big Data is an information management strategy for large volumes of complex information, one that includes and integrates many new types of data, both structured and unstructured, and their management alongside traditional data. Big Data management includes data search and gathering, storage, analysis and validation, sharing, communicating, and updating on a day-to-day business basis. Big Data can be analysed for insights that lead to better decisions and strategic business moves. There are four dimensions to Big Data, known as Volume, Variety, Velocity, and Value.

Blockchain is a distributed database that maintains a continuously growing list of records, to which new sets of recordings (blocks) are added. Blockchain is the technology underlying Bitcoin and other cryptocurrencies: a shared digital ledger, or a continually updated list of all transactions, maintained across private and public networks. (Refer to the Bitcoin)
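
A minimal sketch of the hash-chaining idea (not Bitcoin's actual block format, and the transaction strings are invented): each block stores the hash of its predecessor, so altering any earlier record invalidates the chain.

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents, including the predecessor's hash."""
    body = {k: block[k] for k in ("index", "data", "prev_hash")}
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def make_block(index, data, prev_hash):
    block = {"index": index, "data": data, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)
    return block

def is_valid(chain):
    """Every block must match its own hash and link to its predecessor."""
    return all(
        block["hash"] == block_hash(block)
        and (i == 0 or block["prev_hash"] == chain[i - 1]["hash"])
        for i, block in enumerate(chain)
    )

chain = [make_block(0, "genesis", "0" * 64)]
for i, record in enumerate(["alice pays bob 5", "bob pays carol 2"], start=1):
    chain.append(make_block(i, record, chain[-1]["hash"]))

print(is_valid(chain))                    # True
chain[1]["data"] = "alice pays bob 500"   # tamper with a past record
print(is_valid(chain))                    # False
```

Because each hash covers the previous hash, a tampered block would force recomputing every later block, which is what makes the ledger tamper-evident.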

CAD/CAM (Computer-Aided Design/Computer-Aided Manufacturing) is software used to design products such as electronic circuit boards in computers and other devices.

Cloud Computing is the practice of using a network of remote servers hosted on the Internet to store, manage, and process data in servers, databases, networking, and software, rather than on a local server or a personal computer. Cloud Computing generally describes data centres available to many users over the Internet; the name was inspired by the cloud symbol that is often used to represent the Internet in flowcharts and diagrams. (Refer to the Solution as a Service (SaaS))

Cyber Attack (or Cyberattack) is any illegal attempt to harm computer systems or information using the internet. It targets computer information systems, infrastructures, computer networks, and/or personal computer devices through various malicious acts, usually originating from an anonymous source, that steal, alter, or destroy a specified target by hacking into a susceptible system.

DeepMind is a world leader in artificial intelligence research and its application for positive impact. DeepMind was founded in London in 2010 and backed by some of the most successful technology entrepreneurs in the world. Having been acquired by Google in 2014, it is now part of the Alphabet group and remains based in London, alongside some of the country's leading academic, cultural, and scientific organisations in the King's Cross Knowledge Quarter. (Source: https://deepmind.com/)

Deep Learning is a subfield of machine learning concerned with algorithms inspired by the structure and function of the brain, called artificial neural networks. With massive amounts of computational power, deep learning enables machines to recognise objects and translate speech in real time, making Artificial Intelligence (AI) markedly more capable.

Global Positioning System (GPS) is a global navigation satellite system that provides location, velocity, and time synchronisation, using satellites and portable receivers to determine exact positions on the earth's surface. GPS supports a broad range of military, commercial, and consumer applications; it grew out of United States military satellite navigation work of the 1960s and is owned by the United States government and operated by the United States Space Force.

In-Vehicle Infotainment (IVI) is an automobile industry term for systems that combine entertainment and information delivery to drivers and passengers. IVI includes car audio systems with radios and cassette or CD players, as well as automotive navigation systems, video players, USB and Bluetooth connectivity, computers, in-car internet, and Wi-Fi. The entire automotive industry is moving towards developing innovative technologies to enable better connectivity solutions, improve vehicle safety, and enhance the in-vehicle user experience.

Information Technology (IT) is a set of tools, processes, and methodologies used for computing technologies, such as networking, hardware, software, and the Internet, or the people that work with these technologies, as well as other information distribution technologies such as television and telephones. The IT industry includes computer hardware, software, electronics, semiconductors, the internet, telecom equipment, engineering, healthcare, e-commerce, and computer services, as well as office automation, multimedia, and telecommunications.

Infotainment is a neologistic portmanteau of information and entertainment, referring to media, such as television programmes, intended both to inform and to entertain. Infotainment can also refer to the hardware and software products and systems built into devices designed to serve infotainment content, such as in-car entertainment and information systems (in-vehicle infotainment).

Internet of Things (IoT) is an emerging technology of technical, social, and economic significance: a global infrastructure for the information society, enabling advanced services by interconnecting things based on existing and evolving interoperable information and communication technologies. In the IoT, network connectivity and computing capability extend to objects, sensors, and everyday items, allowing these devices to generate, exchange, and consume data with minimal human intervention.

Machine Learning is a type of artificial intelligence (AI) and a method of data analysis using algorithms, and one of the most important technical developments in the field of AI. It provides computers with the ability to learn without being explicitly programmed. Machine Learning is the subfield of computer science that has given us self-driving cars, practical speech recognition, effective web search, and a vastly improved understanding of the human genome.
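
A minimal sketch of "learning without being explicitly programmed" (the data and learning rate are invented for this example): instead of hard-coding the rule y = 2x + 1, the program recovers the parameters from example data by gradient descent on the mean squared error.

```python
# Fit y = w*x + b to example data by gradient descent (pure Python, no
# ML library assumed). The data below was generated from y = 2x + 1.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]

w, b, lr = 0.0, 0.0, 0.01   # initial parameters and learning rate
for _ in range(5000):
    # Gradients of mean squared error with respect to w and b.
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))   # close to 2.0 and 1.0
```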

Non-Fungible Token (NFT) is a unique and non-fungible (non-interchangeable) unit of data stored on a digital ledger (blockchain); it is a digital asset that represents real-world objects like art, music, in-game items, and videos. An NFT is bought and sold online and is generally built using the same kind of programming as cryptocurrencies like Bitcoin or Ethereum, but that is where the similarity ends. Each NFT has a digital signature that distinguishes it from other NFTs, so NFTs cannot be exchanged for, or treated as equal to, one another (hence, non-fungible).

Office Information System (OIS) is an advanced word processing system of hardware, software, and processes that provides technical support and service for the timely retrieval of accurate information by computerised systems, enabling effective planning, operation, and monitoring of services.

Online Facilitator is a facilitator responsible for ensuring that communication stays on topic during an online e-consultation or e-learning session.

Photogrammetry is the science of making measurements from photographs using computer analysis and recognised scientific tools. Photogrammetry uses photography in surveying and mapping to ascertain measurements between objects, analysing multiple images with reference points, especially for recovering the exact positions of surface points or creating a 3D model. The word photogrammetry may be analysed in two parts: photo (picture) and grammetry (measurement). The input to photogrammetry is photographs and computer programs; the output is typically a 3D map, drawing, dimensions, or a 3D model of real objects. Computer algorithms are improving the speed and accuracy of results in the photogrammetry process.

Programme (Program) is 1) a plan of works or activities with details of what is to be achieved; 2) in computing, a program is a specific set of ordered operations for a computer to perform.

Quantum Computing is the area of study focused on developing computer technology based on the principles of quantum theory, making direct use of quantum-mechanical phenomena, such as superposition and entanglement, to perform operations on data. Quantum Computing is a new generation of technology involving computers that, on certain tasks, have been reported to run many millions of times faster than the most sophisticated supercomputers in the world today. (Refer to the Quantum Theory)
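
The superposition idea can be sketched with a toy state-vector simulation on a classical computer (this simulates, rather than performs, quantum computation): a qubit is two complex amplitudes, and a Hadamard gate maps the definite state |0> to an equal superposition of |0> and |1>, each measured with probability 0.5.

```python
import math

# One qubit is a pair of complex amplitudes (a, b) for states |0> and |1>;
# measurement probabilities are the squared magnitudes |a|^2 and |b|^2.
def hadamard(state):
    """Apply the Hadamard gate: H|0> = (|0> + |1>)/sqrt(2)."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    return tuple(abs(amp) ** 2 for amp in state)

state = (1 + 0j, 0 + 0j)            # definite state |0>
state = hadamard(state)             # equal superposition
p0, p1 = probabilities(state)
print(round(p0, 3), round(p1, 3))   # 0.5 0.5
```

Applying Hadamard a second time returns the qubit to |0>, illustrating that gates are reversible transformations of the state vector.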

Three Dimensional Modelling (3D Modelling) is used to generate 3D models for sectors ranging from engineering and manufacturing to digital animation for films and video games. The first uses of computer graphics were in the early 1960s for scientific and engineering purposes, and CGI artistic expression began by the late 1960s. The first commercially available solid modeler program named Syntha Vision was released in 1969. It wasn't until 20 years later that NURBS and parametric modelling appeared on the scene, the latter being the birth of Pro/ENGINEER. 3D Modelling grew in popularity and utility, with 3D modelling applications ranging from film and video games to all aspects of commercial design and manufacturing. The first 3D printing technology, Stereolithography (SLA), emerged in 1986 and lends its name to the now popular STL file. (Source: www.sculpteo.com/)

Three Dimensional Printer (3D Printer) is a machine used to create three-dimensional objects, producing them layer by layer from a 3D file. The first 3D printer was created in 1984 by Charles Hull, who developed the first model based on the principle of stereolithography (SLA). There are now many 3D printer models using different technologies, such as Fused Deposition Modelling (FDM), in which molten polymer is sprayed onto a support layer and the model is built layer by layer.

C. Computer Technology Definitions

Airtime is the amount of time during which a wireless broadcast is being transmitted.

Analog is 1) a comparison of two things; 2) relating to a device or process in which data is represented by physical quantities that change as a continuously varying signal.

Authentication is a process or action for verifying the correctness of a piece of data or the claimed identity of a user or system.
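
A common instance is password authentication; here is a sketch using a salted key-derivation hash from Python's standard library (the password strings and iteration count are illustrative): the system stores only the salt and digest, and verifies a login attempt by recomputing the digest and comparing it in constant time.

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a digest from the password and a random salt (store both)."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify(password, salt, digest):
    """Recompute the digest and compare it in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("s3cret")
print(verify("s3cret", salt, digest))   # True
print(verify("wrong", salt, digest))    # False
```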

Binary Digit (BIT) is a digit in the pure binary numeration system, taking the value 0 or 1 in binary code.

Bit is 1) a small piece, part, or quantity of something; 2) a unit of information in a computer that must be either 0 or 1.

Bits per Second (BPS) is a measure of transfer speed in digital communication networks. One kilobit per second (Kbps) equals 1,000 bits per second (bps). One megabit per second (Mbps) equals 1,000 Kbps or one million bps. One gigabit per second (Gbps) equals 1,000 Mbps, one million Kbps, or one billion bps.
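
These decimal multiples can be captured in a small conversion helper (the function and table names are invented for this sketch):

```python
# Decimal (SI) multiples used for data rates: each step is a factor of 1,000.
UNITS = {"bps": 1, "kbps": 1_000, "mbps": 1_000_000, "gbps": 1_000_000_000}

def to_bps(value, unit):
    """Convert a rate in the given unit to plain bits per second."""
    return value * UNITS[unit.lower()]

print(to_bps(1, "Gbps"))     # 1000000000
print(to_bps(2.5, "Mbps"))   # 2500000.0
```

Note that data rates conventionally use decimal multiples (1 Mbps = 1,000,000 bps), unlike the binary multiples sometimes used for memory sizes.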

Bluetooth is a short-range radio technology that allows devices (mobile phones, computers, and other electronic devices) to communicate with each other over a wireless interconnection.

Byte is a unit of digital information that consists of eight bits.

D. Additional Definitions

Action Tracking is a method of logging progress on the internet; it is a cookie-based method of tracking actions and page visits.

AlphaGo is a computer program developed by Google DeepMind in London to play the board game Go. In October 2015, it became the first computer Go program to beat a professional human Go player without handicaps on a full-sized 19×19 board. In March 2016, it beat Lee Sedol in a five-game match, the first time a computer Go program had beaten a 9-dan professional without handicaps. AlphaGo's algorithm uses a Monte Carlo tree search to find its moves, based on knowledge previously "learned" by machine learning, specifically by an artificial neural network (a deep learning method) through extensive training from both human and computer play. (Source: Wikipedia)

Backhauling is 1) transportation in the reverse direction to the principal haul (e.g., the transportation of gas apparently in the reverse direction to the main flow of a pipeline); 2) in telecommunications, the physical part of a communications network between the central backbone and the individual local networks; 3) wireless backhaul, the wireless communication and network infrastructure responsible for transporting communication data from end users or nodes to the central network or infrastructure, and vice versa.

Bitcoin is a cryptocurrency and a payment system: a type of digital currency in which a record of transactions is maintained and new units of currency are generated by the computational solution of mathematical problems, and which operates independently of a central bank. Bitcoin was invented by an unidentified programmer under the name Satoshi Nakamoto; it was introduced in 2008 and released as open-source software in 2009.

Browsewrap Agreement is a term used in Internet law for a contract or licence agreement in which the terms of use (TOU) or terms of service (TOS) of a website can be viewed on a linked page, but the user of the website is not required to click a button to acknowledge acceptance.

Cognitive Computing is the simulation of human thought processes in a computerised model, building on neural networks and deep learning and covering several disciplines, including machine learning, natural language processing, reasoning, speech recognition, vision, and human-computer interaction, among other technologies. Cognitive computing systems often need to weigh conflicting evidence and suggest an answer that is best rather than right.

Data Acquisition (DAQ) is the process of sampling signals that measure physical phenomena and converting them into digital form. A DAQ system consists of sensors, hardware, and a computer with programmable software, and provides a more powerful, flexible, and cost-effective measurement solution.
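
The sampling and conversion step can be sketched in software (the signal, sample rate, bit depth, and voltage range are all hypothetical): a 5 Hz "analog" sine wave is sampled at 100 Hz, and each sample is quantized by an imagined 8-bit converter spanning -1 V to +1 V.

```python
import math

def quantize(volts, bits=8, v_min=-1.0, v_max=1.0):
    """Map an analog voltage to the nearest integer code of a bits-wide ADC."""
    levels = 2 ** bits - 1
    clamped = max(v_min, min(v_max, volts))
    return round((clamped - v_min) / (v_max - v_min) * levels)

sample_rate = 100.0   # samples per second
samples = [quantize(math.sin(2 * math.pi * 5 * n / sample_rate))
           for n in range(20)]
print(min(samples), max(samples))   # codes span the full 0..255 range
```

Higher bit depths give finer voltage resolution, and the sample rate must exceed twice the signal's highest frequency (the Nyquist criterion) to avoid aliasing.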

Default Value is a value that a program or system assigns automatically when no value is specified by the user.
