Fifth Generation Computer Systems

The Fifth Generation Computer Systems (FGCS) project aimed to create an "epoch-making computer" with supercomputer-like performance and to establish a platform for future advances in artificial intelligence. Japan's Ministry of International Trade and Industry (MITI) asked the Japan Information Processing Development Center (JIPDEC) to indicate a number of future directions, and in 1979 offered it a three-year contract to carry out more in-depth studies together with industry and academia. Out of this initial work, the central direction that emerged was to build parallel computers for artificial intelligence applications using concurrent logic programming.

The project envisioned a machine running on top of large databases (as opposed to a traditional filesystem), with a logic programming language used to define and access the data by means of massively parallel processing; a short sketch of this idea appears at the end of this section. The target defined by the FGCS project was to develop "Knowledge Information Processing systems", roughly meaning applied artificial intelligence.

Parallel projects were soon set up abroad: in the US as the Strategic Computing Initiative and the Microelectronics and Computer Technology Corporation (MCC), in the UK as Alvey, and in Europe as the European Strategic Program on Research in Information Technology (ESPRIT), along with the European Computer-Industry Research Centre (ECRC) in Munich, a collaboration between ICL in Britain, Bull in France, and Siemens in Germany.

The project's highly parallel architecture, however, was eventually surpassed in speed by less specialized hardware such as Sun workstations and Intel x86 machines.[citation needed] CPU performance quickly overcame the barriers that experts had anticipated in the 1980s, and the value of parallel computing dropped to the point where it was for some time used only in niche situations. Although a number of workstations of increasing capacity were designed and built over the project's lifespan, they generally found themselves soon outperformed by "off the shelf" units available commercially. During the project's lifespan, GUIs became mainstream in computers, the internet enabled locally stored databases to become distributed, and even simple research projects produced better real-world results in data mining. The industry's ability to build ever-faster single-CPU systems (tied to Moore's law and its periodic doubling of transistor counts) only began to be threatened years later, when parallelism returned to the mainstream in the form of multi-core processors.
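To make the database-as-logic idea concrete, the following is a minimal, hypothetical sketch in standard Prolog; the FGCS machines actually ran KL1, a concurrent logic language, but the define-and-query principle is the same, and the predicate names and facts below are invented for illustration. Data is stored as facts and rules rather than as files, and "accessing" the data means asking the system to prove a goal; each resolution step is one logical inference, the unit in which FGCS machine speed was measured (logical inferences per second, LIPS).

    % Data is defined declaratively as facts -- no filesystem
    % layout, record format, or access path is specified.
    parent(taro, hanako).
    parent(hanako, jiro).

    % A rule derives new knowledge from stored facts: X is an
    % ancestor of Y directly, or through an intermediate Z.
    ancestor(X, Y) :- parent(X, Y).
    ancestor(X, Y) :- parent(X, Z), ancestor(Z, Y).

    % Accessing the data means posing a query; the system
    % searches for proofs, and each proof binds the variable Who:
    %   ?- ancestor(taro, Who).
    %   Who = hanako ;
    %   Who = jiro.

In a concurrent logic language such as KL1, the goals in a clause body run as communicating concurrent processes, which is how the project intended parallelism to arise naturally from a program's logical structure rather than from explicit thread management.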