Event Highlights

Date: 2019-12-18

Academic Lecture by a Visiting Professor from Abroad: "Intelligent Pattern Recognition and Applications to Imaging"

Professor Wang delivering the lecture
This talk is concerned with fundamental aspects of Intelligent Pattern Recognition (IPR) and its applications to imaging. It covers the following topics: basic concepts of automata, grammars, trees, graphs, and languages; ambiguity and its importance; a brief overview of Artificial Intelligence (AI); a brief overview of Pattern Recognition (PR); and the question of what Intelligent Pattern Recognition is. Finally, some future research directions are discussed.
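As an illustrative aside (not part of the talk materials), the notion of an automaton mentioned above can be sketched in a few lines of Python: a deterministic finite automaton (DFA) that accepts exactly the binary strings containing an even number of 1s. All names here are hypothetical.

```python
# Minimal DFA sketch: a transition table maps (state, symbol) to the next
# state; a string is accepted if reading it ends in an accepting state.
def make_dfa(transitions, start, accepting):
    """Return a function that decides whether the DFA accepts a string."""
    def accepts(s):
        state = start
        for ch in s:
            state = transitions[(state, ch)]
        return state in accepting
    return accepts

# DFA over {0, 1} accepting strings with an even number of 1s.
even_ones = make_dfa(
    transitions={
        ("even", "0"): "even", ("even", "1"): "odd",
        ("odd", "0"): "odd", ("odd", "1"): "even",
    },
    start="even",
    accepting={"even"},
)

print(even_ones("1010"))  # True: two 1s
print(even_ones("111"))   # False: three 1s
```

The same table-driven pattern extends to any regular language by changing the transition table, start state, and accepting set.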

In practice, machine learning (ML) methods, which are nonparametric tools, must be applied to data sets with a finite sample size. Robustness is therefore an important issue: the assumption that all data points were independently generated by the same distribution can be violated, and outliers frequently occur in real data sets.
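A minimal illustration of this robustness issue, using made-up numbers: a single outlier shifts the sample mean substantially, while the median, a robust statistic, is barely affected.

```python
# Toy data: five well-behaved measurements plus one gross outlier.
data = [2.0, 2.1, 1.9, 2.2, 2.0]
contaminated = data + [100.0]  # one outlier violates the i.i.d. assumption

def mean(xs):
    return sum(xs) / len(xs)

def median(xs):
    s = sorted(xs)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

print(mean(data), mean(contaminated))      # ~2.04 vs ~18.37
print(median(data), median(contaminated))  # 2.0 vs 2.05
```

The outlier drags the mean almost an order of magnitude away from the bulk of the data, while the median moves only slightly; this is the kind of failure robust methods are designed to avoid.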
The practical usefulness of regularized learning methods depends heavily on the ability to build models quickly and effectively, which in turn calls for efficient optimization methods. Many ML algorithms rely on comparing two objects through a similarity or distance function. In many cases, standard functions such as the Euclidean distance are sufficient; some problems, however, require more appropriate metrics. For instance, because the Euclidean distance is based on the L2-norm, it tends to perform poorly in the presence of outliers. The Mahalanobis distance is a straightforward and general-purpose alternative that applies a linear transformation to the data. However, Mahalanobis distances have two key problems: 1) the number of parameters to be learned grows quadratically with the dimensionality of the data, which poses a scalability problem; and 2) a linear transformation is not sufficient for data sets with nonlinear decision boundaries.
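The relationship between the two distances can be sketched as follows (an illustrative example with synthetic data, not from the text): the Mahalanobis distance is the Euclidean distance after a linear transformation defined by a matrix M, and with M equal to the identity it reduces to the plain Euclidean distance.

```python
import numpy as np

# Mahalanobis distance: d_M(x, y) = sqrt((x - y)^T M (x - y)),
# where M is typically the inverse covariance matrix of the data.
def mahalanobis(x, y, M):
    d = x - y
    return float(np.sqrt(d @ M @ d))

# Synthetic data with two correlated features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) @ np.array([[1.0, 0.8], [0.0, 0.6]])
M = np.linalg.inv(np.cov(X, rowvar=False))

x, y = X[0], X[1]
print(mahalanobis(x, y, M))
# With M = I, the Mahalanobis distance equals the Euclidean distance:
print(mahalanobis(x, y, np.eye(2)), np.linalg.norm(x - y))
```

The quadratic-growth problem mentioned above is visible here: for d-dimensional data, M has d × d entries, so metric learning in high dimensions quickly becomes expensive.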
Models can also be selected by means of regularization methods; that is, models are penalized according to their number of parameters (Alpaydin, 2004; Fromont, 2007). In general, Bayesian learning techniques use knowledge of the prior probability distributions to assign lower probabilities to more complex models. Popular model selection techniques include the Akaike information criterion (AIC), the Takeuchi information criterion (TIC), the Bayesian information criterion (BIC), cross-validation (CV), and the minimum description length (MDL).
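As a hedged sketch of how two of these criteria penalize model complexity (an illustrative example with synthetic data; the formulas assume Gaussian errors): AIC and BIC both subtract twice the log-likelihood and add a penalty proportional to the number of parameters k, with BIC penalizing more heavily as the sample size n grows.

```python
import numpy as np

# AIC = 2k - 2 log L,  BIC = k log(n) - 2 log L,
# with log L the Gaussian maximum log-likelihood given the residuals.
def aic_bic(y, y_hat, k):
    n = len(y)
    rss = float(np.sum((y - y_hat) ** 2))
    loglik = -0.5 * n * (np.log(2 * np.pi * rss / n) + 1)
    return 2 * k - 2 * loglik, k * np.log(n) - 2 * loglik

# Synthetic data generated by a degree-2 polynomial plus noise.
rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 60)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(scale=0.2, size=x.size)

# Fit candidate polynomial models of increasing degree and score each.
results = {}
for degree in range(1, 6):
    coeffs = np.polyfit(x, y, degree)
    y_hat = np.polyval(coeffs, x)
    aic, bic = aic_bic(y, y_hat, k=degree + 1)
    results[degree] = (aic, bic)
    print(f"degree={degree}  AIC={aic:.1f}  BIC={bic:.1f}")
```

Under-fitting (degree 1) is punished through the likelihood term, while over-fitting (degree 5) is punished through the parameter penalty, so both criteria favor a degree near the true one.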
Group photo after the event

Project: 全球在地逐鹿萬里--產學鏈結奇兵創業

Published by: Department of Computer Science and Information Engineering
