Investigators:
David Banks
Description:
The goal is to support research in brain repair
and neural regeneration by improving display
of time-varying 3D brain data, including data
from confocal microscopy, MRI, fMRI, and diffusion
tensor imaging.
2. Realistic Illumination
Investigators:
David Banks
Description:
Apply global illumination to complicated surfaces
arising as isosurfaces of 3D data. Particular
example: realistic images of the human brain from
3D scans.
3. Flowspace
Investigators:
David Banks
Description:
Multidimensional analysis of features in time
varying vector fields. The goal is to unify the
way features (streamlines, streaklines, timelines,
critical points, vortices, shocks) are represented
mathematically and displayed graphically.
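For one of these features, a streamline can be sketched numerically:
it is a curve traced everywhere tangent to the (time-frozen) vector
field. The field, step size, and forward-Euler integrator below are
illustrative assumptions, not project code:

```python
# Trace a streamline of the 2D vector field v(x, y) = (-y, x),
# whose streamlines are circles around the origin.
def velocity(x, y):
    return -y, x

def streamline(x0, y0, h=0.001, steps=1000):
    """Forward-Euler trace: p_{k+1} = p_k + h * v(p_k)."""
    pts = [(x0, y0)]
    x, y = x0, y0
    for _ in range(steps):
        vx, vy = velocity(x, y)
        x, y = x + h * vx, y + h * vy
        pts.append((x, y))
    return pts

pts = streamline(1.0, 0.0)
# For this circular field, the radius should stay nearly constant.
r_end = (pts[-1][0] ** 2 + pts[-1][1] ** 2) ** 0.5
```

Streaklines and timelines are traced similarly, but against the
time-varying field rather than a frozen snapshot.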
4. Web-based 3D Simulation
Investigators:
David Banks
Description:
We are developing instructional modules to show
3D simulations of optics for undergraduate
students to interact with. We use Java+EAI+VRML
to coordinate simulation and 3D display. Called
"The Optics Project" (TOP), the system is being
extended to support scripting and collaboration.
5. Vector Field Visualization
Investigators:
David Banks
Description:
Analysis and display of 3D time-varying vector
fields from engineering and science applications.
6. Performance Support for SCORM Compliant Online Training
Investigators:
Ian Douglas
Description:
This project involves developing performance support and automated
design tools for the US Navy. The tools will help in the transition
to the sharable courseware object reference model (SCORM), which is
part of the Advanced Distributed Learning initiative of the
Department of Defense. The work incorporates research into design
notations, human factors, and critiquing systems.
7. Problem-Solving Tools for Distributed Applications
Investigators:
Robert van Engelen, Kyle Gallivan
Description:
SOAP (Simple Object Access Protocol) is a versatile new remote
procedure calling mechanism that operates over the Internet. SOAP is
lightweight and adopts two existing technologies, XML and HTTP, to
build distributed applications over the Internet. SOAP has many
advantages over CORBA and Java technologies in terms of its light
weight, platform-independent interoperability, and handling of
security issues. SOAP adopts RPC (remote procedure calling) with an
XML marshalling format, which requires XML serialization capabilities
in SOAP-enabled applications. Implementing XML serialization in C and
C++ is a difficult programming task for SOAP wrapper programmers. We
developed a SOAP stub compiler for C that automatically generates C
data structure serialization wrappers and remote procedure calling
stubs. The stub compiler enables C and C++ SOAP interoperability over
the Internet with other SOAP-enabled applications (e.g. written in
Java, Perl, Visual Basic, etc.). The main goal of the research is
tool development for wrapper generation.
More information can be found at:
http://websrv.cs.fsu.edu/~engelen/soap.html
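The XML marshalling that the stub compiler automates can be shown by
hand. The sketch below builds a SOAP 1.1 RPC envelope for a
hypothetical call add(a=2, b=3); the method name and the "urn:demo"
namespace are invented for illustration:

```python
from xml.etree import ElementTree

# Hand-built sketch of SOAP 1.1 RPC marshalling: a remote call is
# serialized as an XML envelope (normally sent over HTTP POST).
def soap_envelope(method, params):
    """Serialize a call method(**params) as a SOAP 1.1 envelope."""
    body = "".join(f"<{k}>{v}</{k}>" for k, v in params.items())
    return (
        '<SOAP-ENV:Envelope xmlns:SOAP-ENV='
        '"http://schemas.xmlsoap.org/soap/envelope/">'
        f'<SOAP-ENV:Body><ns:{method} xmlns:ns="urn:demo">'
        f'{body}</ns:{method}></SOAP-ENV:Body></SOAP-ENV:Envelope>'
    )

msg = soap_envelope("add", {"a": 2, "b": 3})
ElementTree.fromstring(msg)   # sanity check: envelope is well-formed
```

A generated C stub would perform exactly this serialization (and the
inverse parsing of the response) for the corresponding C function
signature.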
8. Problem-Solving Tools for Weather Forecast Systems
Investigators:
Robert van Engelen
Description:
The Ctadel system is a code generation system that automatically
generates efficient FORTRAN code for serial, vector, and parallel
systems. The Ctadel system "compiles" a weather forecast system down
to optimized FORTRAN source code using a customized computer algebra
system. Other applications, such as coupled ocean-climate models, are
under investigation.
More information can be found at:
http://websrv.cs.fsu.edu/~engelen/imacs99aca.html and
http://websrv.cs.fsu.edu/~engelen/ctadel/dyn/report.html
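The idea of compiling a specification down to source code via
computer algebra can be caricatured in a few lines. The expression
format, rewrite rules, and output below are toy assumptions, far
simpler than Ctadel's actual transformations:

```python
# Miniature of algebra-driven code generation: a symbolic expression
# is simplified by rewrite rules and then printed as source text.
def simplify(e):
    if isinstance(e, tuple):
        op, a, b = e[0], simplify(e[1]), simplify(e[2])
        if op == "*" and (a == 0 or b == 0):
            return 0                  # x*0 -> 0
        if op == "*" and a == 1:
            return b                  # 1*x -> x
        if op == "+" and a == 0:
            return b                  # 0+x -> x
        return (op, a, b)
    return e

def emit(e):
    """Print an expression tree in infix (FORTRAN-like) form."""
    if isinstance(e, tuple):
        return f"({emit(e[1])} {e[0]} {emit(e[2])})"
    return str(e)

expr = ("+", ("*", 0, "dp"), ("*", 1, ("+", "u", "v")))
code = f"      t = {emit(simplify(expr))}"
```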
9. Loop Optimizations
Investigators:
Robert van Engelen
Description:
Loop optimization is the main target of many optimizing and
restructuring compilers that produce efficient code. An accurate
determination of induction variables and dependencies in loops is of
paramount importance to many loop optimization and parallelization
techniques, such as generalized loop strength reduction, loop
parallelization by induction variable substitution, and loop
invariant expression elimination. We developed a new method for
generalized induction variable recognition. Existing methods are
either ad hoc and not powerful enough to recognize some types of
induction variables, or powerful but not safe. We are developing loop
optimization methods that are safe, simple to implement in a
compiler, better suited to controlling loop transformations, and able
to recognize a large class of induction variables.
More information can be found at:
http://websrv.cs.fsu.edu/research/reports/TR-000102.ps
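Loop strength reduction, one of the techniques named above, can be
sketched as follows; the address computation and constants are
illustrative. Safety here means the transformed loop must produce
exactly the same values as the original:

```python
# Strength reduction on the induction expression addr = base + 4*i:
# the per-iteration multiply is replaced by an additive update.
def addresses_original(base, n):
    out = []
    for i in range(n):
        out.append(base + 4 * i)   # multiply on every iteration
    return out

def addresses_reduced(base, n):
    out = []
    t = base                       # t tracks base + 4*i
    for _ in range(n):
        out.append(t)
        t += 4                     # strength-reduced update
    return out

same = addresses_original(1000, 8) == addresses_reduced(1000, 8)
```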
10. Probabilistic Networks
Investigators:
Robert van Engelen
Description:
Probabilistic networks (also called Bayesian belief networks) are
used in medical expert systems, general advisory expert systems,
image recognition, information retrieval, etc. We are investigating
approximation techniques to speed up diagnostic inference processes.
More information can be found at:
http://www.wi.leidenuniv.nl/TechRep/1996/tr96-15.html
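The diagnostic inference being accelerated can be illustrated on the
smallest possible network, a single Disease -> Symptom link, where
the posterior follows from Bayes' rule. The probabilities below are
illustrative numbers only:

```python
# Diagnostic inference in a tiny belief network Disease -> Symptom.
p_d = 0.01                 # prior P(disease)
p_s_given_d = 0.9          # P(symptom | disease)
p_s_given_not_d = 0.05     # P(symptom | no disease)

# Bayes' rule: P(d | s) = P(s | d) P(d) / P(s)
p_s = p_s_given_d * p_d + p_s_given_not_d * (1 - p_d)
p_d_given_s = p_s_given_d * p_d / p_s
```

In a large network this posterior requires summing over many hidden
variables, which is where approximation techniques pay off.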
11. Visual Perception Modeling and Its Application
Investigator:
Xiuwen Liu
Description:
We explore computational models for visual perception with emphasis
on real-world applications. We have proposed a textural feature named
the spectral histogram, which significantly outperforms a large
number of existing methods in a systematic comparison on texture
classification. Currently we are exploring its applications in remote
sensing image classification, face recognition, effective image
compression based on object representation, and object recognition.
More information can be found at our group's web page
http://websrv.cs.fsu.edu/fsvision.
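A spectral histogram, roughly, concatenates histograms of an image's
responses to a bank of filters. The sketch below uses a trivial
two-filter bank (raw intensities and a horizontal difference filter)
and coarse binning; these are illustrative simplifications, not the
published feature's filter bank:

```python
# Sketch of a spectral histogram: filter the image, histogram each
# response, and concatenate the histograms into one feature vector.
def grad_x(img):
    """Horizontal difference filter (a crude derivative)."""
    return [[row[j + 1] - row[j] for j in range(len(row) - 1)]
            for row in img]

def hist(values, bins, lo, hi):
    h = [0] * bins
    w = (hi - lo) / bins
    for v in values:
        h[min(bins - 1, max(0, int((v - lo) / w)))] += 1
    return h

img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 0, 1, 1]]
flat = [v for row in img for v in row]
gx = [v for row in grad_x(img) for v in row]
feature = hist(flat, 2, 0, 1.001) + hist(gx, 2, -1, 1.001)
```

Classification then compares such feature vectors, e.g. with a
chi-square or Euclidean distance.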
12. Visual Interfaces for Computers
Investigators:
Xiuwen Liu and Anuj Srivastava
Description:
We are investigating a novel computational framework for visual
recognition that will support visual interfaces for computers. One of
the key limitations of current approaches is that existing methods do
not perform well in real-world environments due to their poor
generalization capability. We propose a new framework based on Markov
chain Monte Carlo techniques to systematically choose visual features
which are guaranteed to perform well in new environments. Active
projects include face recognition, facial expression modeling and
recognition, and generic 3-D object recognition.
For more information, please visit our web sites at
http://websrv.cs.fsu.edu/fsvision and
http://calais.stat.fsu.edu.
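The MCMC-based selection idea can be sketched as a Metropolis walk
over subsets of candidate features, favoring subsets with a higher
score. The feature names, score function, and temperature below are
hypothetical stand-ins for a real generalization measure:

```python
import math
import random

# Metropolis walk over feature subsets: flip one feature per step and
# accept moves with probability min(1, exp(score_gain / T)).
FEATURES = ["eyes", "nose", "mouth", "hairline", "shadow"]

def score(subset):
    informative = {"eyes", "nose", "mouth"}   # assumed ground truth
    return len(subset & informative) - 0.1 * len(subset)

def metropolis(steps=2000, temperature=0.1, seed=0):
    rng = random.Random(seed)
    cur = set()
    for _ in range(steps):
        cand = cur ^ {rng.choice(FEATURES)}   # flip one feature
        gain = score(cand) - score(cur)
        if rng.random() < min(1.0, math.exp(gain / temperature)):
            cur = cand
    return cur

best = metropolis()
```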
13. Activity Profiles for Intrusion Detection
Investigators:
Ladislav Kohout, Alec Yasinsac, Ernest McDuffie
Description:
We are concerned with the application of fuzzy logics and relational
computational algorithms to forming activity profiles in information
processing systems, in order to distinguish desirable from
undesirable activities. We apply BK-products of relations and fuzzy
measures (which subsume probability as a special case). Fuzzy logics
allow us to create possibility profiles and usuality profiles and to
combine these with probability profiles when necessary to detect
unusual or abnormal activities. The computations can use Fast Fuzzy
Relational Algorithms that execute on conventional computing
architectures as well as on soft computing architectures, in
particular Neuro-Fuzzy Networks. Using BK-nonassociative relational
products in the feedback loop instead of traditional back-propagation
speeds up training of the networks considerably.
More information about fuzzy relational methods can be found in two
tutorials, namely:
(i) slides for the 3-hour FUZZ-IEEE 2000 tutorial on Applications of
BK-products and Fuzzy Relations, at
http://websrv.cs.fsu.edu/~kohout/tut00.ps
(ii) the 1-day tutorial Relational Semiotic Methods for Design of
Intelligent Systems, IEEE-ISIC/CIRA/ISAS'98, presented at NIST, at
http://websrv.cs.fsu.edu/~kohout/tut98fin.ps
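A BK-product can be sketched concretely. The sub-triangle product
below uses the Goedel implication, one standard choice; the relations
(users x actions, actions x alarm classes) are invented for
illustration:

```python
# BK sub-triangle product of fuzzy relations R and S:
#   (R <| S)(a, c) = inf over b of I(R(a, b), S(b, c)),
# with the Goedel implication I(x, y) = 1 if x <= y else y.
def goedel(x, y):
    return 1.0 if x <= y else y

def bk_subproduct(R, S):
    n, m, p = len(R), len(S), len(S[0])
    return [[min(goedel(R[a][b], S[b][c]) for b in range(m))
             for c in range(p)] for a in range(n)]

# R: degree to which each user performs each action;
# S: degree to which each action indicates each alarm class.
R = [[0.9, 0.2],
     [0.1, 0.8]]
S = [[1.0, 0.0],
     [0.3, 0.9]]
T = bk_subproduct(R, S)
```

The sub-triangle product reads as "whenever the user performs the
action, the action indicates the class", which is the inference shape
used when matching activity profiles.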
14. A Knowledge-Based Management Decision Tool for Decision Making
with Incomplete Information Incorporating Cost Modeling and
Affordability Assessment Component
Investigators:
Ladislav Kohout, Ernest McDuffie, G. Zenz (FSU College of Business)
Description:
Supported by grants from the US Air Force ManTech program and
NSF-MOTI DMI 9525991 (in collaboration with Pratt & Whitney), and
NSF-DMI 9726027, we have developed methods for knowledge elicitation
and relational representation of substantive knowledge (concepts,
linguistic descriptors, physically measurable parameters, and
interactions) that are relevant to manufacturing using intelligent
systems and are applicable not only to technical but also to human
and organizational subsystems of a total manufacturing production
system.
We developed these techniques as part of a larger project concerned
with questions of the affordability of new high-tech products in the
aviation industry that were never manufactured before. A systematic
accounting of incomplete or conflicting information and constraints
is difficult in this context. In our project, a new methodology based
on fuzzy relational computations has been applied to provide
effective methods for dealing with such problems. In our work we have
addressed a number of issues, two of which are relevant in this
context:
(1) Use of Fuzzy Relational Methods and BK-compositions of relations
for data and knowledge elicitation and representation, as well as for
affordability modeling.
(2) Value Analysis for Integration of Technology and Business. The
models for value analysis also use fuzzy relational techniques.
We are interested in continuing this work, which involves exploring
fuzzy set and relational methods. We are seeking industrial partners
interested in the question of incorporating cost models into
engineering design that uses distributed computing tools, and who
would provide a new application area and data for validation of the
techniques we have developed so far.
A potential new application of our techniques of great importance
would be their use in the security evaluation of Information
Technology. The "Trusted Network Interpretation", the so-called 'Red
Book' (which is the continuation of the US TCSEC, the 'Orange Book'),
attempts to address this problem with concepts and terminology
introduced in the Orange Book. These concepts include "granularity"
of security levels, "strengths" of security mechanisms, cost of
mechanisms with different degrees of risk, etc. Such concepts can be
handled to advantage by fuzzy sets and measures in a
computer-interpretable way.
15. Knowledge Networking with OpenMath
Investigators:
Ladislav Kohout, Lois Hawkes, Mike Seppala (FSU Dept of Mathematics)
Description:
The purpose of this project is to develop a scheme for knowledge
networking which utilizes the OpenMath international protocol
structures but at the same time extends these, incorporating
linguistic and non-mathematical symbolic representations and
communication schemes. While the core of OpenMath would form a proper
part of our scheme, it will be extended in the following ways:
(1) Incorporating mechanisms for dealing with uncertainty,
indeterminacy, and incompleteness of information and knowledge, and
for dealing with conflicting information.
(2) Enriching the scheme by introducing linguistic components and
mechanisms for "computing with words".
(3) Incorporating mechanisms for achieving maximal trustworthiness of
systems; in particular, achieving good dynamic protection of data and
knowledge not only against intruders but also against the combining
of inappropriate data and knowledge caused by possible malfunctioning
of participating networked agents/systems.
(4) Incorporating the means for performing trustworthy measurements
yielding engineering and scientific data of mathematical character,
as well as trustworthy acquisition of cognitive and linguistic
information.
We are building on experience in the development of the OpenMath
international standards/protocols for the sharing/exchange of
mathematical information (both data and knowledge), which enables
distributed co-operation of agents/systems with different kinds of
competence. Fuzzy sets, relations, and fuzzy BK-products of relations
play an essential role in extensions (1)--(4).
While the OpenMath scheme allows for distributed co-operative
computing involving mathematical agents with different kinds of
mathematical competence, e.g. the combined distributed action of
numerical and symbolic mathematical systems (Mathematica, Matlab,
Maple, Axiom, etc.), the requirements of engineering design, decision
making in business, environmental protection, medicine, and social
planning also call for information and knowledge of a
non-mathematical kind to be incorporated. This, however, does not
eliminate the computational component of mathematics so important in
engineering design, but makes it part of the extended scheme, which
also contains linguistic and non-mathematical symbolic
representations and communication schemes.
We are seeking industrial collaborators for the purpose of submitting
grant proposals to NSF and other federal agencies. In this work, it
is essential to validate the ideas by prototypes ranging over
different application areas. Andreas Strotmann (a Ph.D. student of
Prof. L.J. Kohout) has been working in collaboration with the Dept.
of Oceanography on linking Prof. Robert van Engelen's CTADEL system
with other symbolic tools using the OpenMath standard. We are
collaborating with Dr. Basil Savitsky of the FSU Dept. of Geography
on the interpretation of remotely sensed satellite data by a hybrid
distributed intelligent system and on linking these to GIS systems.
Because the satellite scenes are concerned with land data, the
quality and information content of the satellite images can be
validated by comparison with ground data collected by field work.
This applies in particular to forestry data. Hence not only an
Internet-based but also a wireless-compatible extension to OpenMath
systems is essential for this kind of application.
16. Random Number Generation Research
Investigators:
Michael Mascagni
Description:
Dr. Michael Mascagni's Florida State University Department of
Computer Science group in random number generation is developing
high-performance mathematical software for both pseudorandom and
quasirandom number generation. This work has resulted in the highly
popular Scalable Parallel Random Number Generators (SPRNG) library.
This research utilizes skills from Computer Science, Discrete
Mathematics, Computational Number Theory, and Statistics, and is very
interdisciplinary. This work is funded by the Department of Energy's
ASCI program, the National Science Foundation, and the Army Research
Office, and involves several national and international collaborations.
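The core idea behind a scalable parallel generator library is that
each process receives its own parameterized stream. The sketch below
parameterizes a 64-bit linear congruential generator by its additive
constant; this is an illustrative scheme, not SPRNG's actual
generators or parameterization:

```python
# Each parallel stream id selects a distinct odd increment for a
# 64-bit LCG, so streams are created without communication.
M = 2 ** 64
A = 6364136223846793005          # a widely used 64-bit LCG multiplier

def stream(stream_id, n, seed=12345):
    c = 2 * stream_id + 1        # distinct odd increment per stream
    x = seed
    out = []
    for _ in range(n):
        x = (A * x + c) % M
        out.append(x / M)        # uniform variate in [0, 1)
    return out

s0, s1 = stream(0, 5), stream(1, 5)
```

A production library must also certify the quality and independence
of the streams, which is where the number theory and statistics come
in.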
17. Research into Monte Carlo and Quasi-Monte Carlo Methods and
Applications
Investigators:
Michael Mascagni
Description:
Dr. Michael Mascagni's Florida State University Department of
Computer Science group in Monte Carlo and quasi-Monte Carlo methods
is involved
in both the development and refinement of new algorithms and in their
application to problems in materials science, environmental science,
finance, and biochemistry and neuroscience. Of particular interest is
the application of quasi-Monte Carlo methods to Markov-chain based
problems. This work is supported by the National Science Foundation
and the Army Research Office, and involves several national and
international collaborations.
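The pseudorandom/quasirandom distinction can be seen on a
one-dimensional integral. Below, the integral of x^2 over [0, 1]
(exactly 1/3) is estimated with ordinary Monte Carlo sampling and
with the van der Corput low-discrepancy sequence:

```python
import random

# Monte Carlo vs. quasi-Monte Carlo estimates of the integral of x^2
# over [0, 1]. The quasirandom points come from the van der Corput
# sequence in base 2 (the bit-reversed radical inverse of i).
def van_der_corput(i):
    x, f = 0.0, 0.5
    while i:
        x += f * (i & 1)
        i >>= 1
        f *= 0.5
    return x

N = 4096
rng = random.Random(1)
mc = sum(rng.random() ** 2 for _ in range(N)) / N
qmc = sum(van_der_corput(i) ** 2 for i in range(N)) / N
```

Quasirandom points cover [0, 1] more evenly than random draws, which
is why their integration error typically shrinks faster than the
O(1/sqrt(N)) Monte Carlo rate for smooth integrands.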
18. Parallel, Distributed, and Grid-Based Computing Research
Investigators:
Michael Mascagni
Description:
Dr. Michael Mascagni's Florida State University Department of
Computer Science group in parallel, distributed, and grid-based
computing is
developing computational infrastructure to support the large
computations required in the software and applications research in the
two other related groups. Infrastructure for naturally parallel
computations that arise in computational number theory, random number
generation, and Monte Carlo methods is studied and developed.
19. GNOSYS: A Next-Generation Knowledge Management System
Investigator:
Daniel Schwartz
Description:
This project is developing GNOSYS, a next-generation knowledge
management system. The work applies recent advances in artificial
intelligence, 2D and 3D graphics, and Internet/intranet technology
to the design and development of indexes for large distributed
digital libraries. The indexes take the form of concept taxonomies
(or semantic networks, or ontologies), and so have a much richer
semantic structure than simple trees. In addition, they are created
by their own communities of users and thus serve as knowledge bases
that grow and evolve over time. An underlying semantics and reasoning
algorithm are provided, which enable users to query the index as to
the deeper relations between classification categories. Advanced
graphics techniques facilitate browsing, helping users find their way
through these more complex structures without becoming confused or
lost.
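The kind of query such an index supports can be sketched with a toy
taxonomy: "is-a" links form a graph, and subsumption questions are
answered by following links transitively. The concepts below are
invented for illustration:

```python
# Toy concept taxonomy: each concept maps to its direct parents.
ISA = {
    "quicksort": {"sorting algorithm"},
    "sorting algorithm": {"algorithm"},
    "algorithm": {"computer science"},
    "semantic network": {"knowledge representation"},
}

def subsumed_by(concept, category):
    """Is `concept` classified under `category`, transitively?"""
    if concept == category:
        return True
    return any(subsumed_by(p, category) for p in ISA.get(concept, ()))

ans = subsumed_by("quicksort", "computer science")
```

A real index adds richer link types than "is-a", which is what makes
the reasoning (and the browsing interface) more involved than tree
traversal.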
20. A Comprehensive, Retargetable Embedded Systems Software
Development Environment
Investigators:
David Whalley, Kyle Gallivan, Robert van Engelen, Xin Yuan
Description:
Application developers face many challenges when developing code
for embedded systems. Traditional optimizing compiler technology
is often of little help in meeting the constraints associated with
such systems. Thus, many embedded systems applications are developed
in assembly code in order to meet the constraints on speed, space,
power, cost, etc. We are developing a compiler back end to support
embedded systems development. Besides retargeting our compiler to
different embedded systems (our focus will be on DSPs), we have the
following goals:
a. support interactive compilation
We will allow a user to interactively direct the compilation
of an application. Features will include the ability to view
the low-level program representation in a graphical display, to
specify a sequence of optimization phases (possibly repetitive)
to be performed, to specify transformations by hand to take
advantage of unusual architectural features that the compiler
cannot exploit, and to undo previous transformations to support
experimentation.
b. support iterative compilation
We plan to allow a user to specify constraints on code portions,
such as speed, size, and power. We will have the compiler
automatically attempt different combinations of optimizations
in an attempt to meet these constraints.
c. develop compiler optimizations for architectural features commonly
found in embedded systems (zero overhead loop buffers, modulo address
arithmetic, etc.)
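Goal (b), iterative compilation, can be sketched as a search over
orderings of optimization phases. The phases and their effects on
code size below are hypothetical numbers, not measurements:

```python
from itertools import permutations

# Try every ordering of the optimization phases and keep the one that
# meets the size limit with the smallest resulting code size.
def apply_phases(order, size):
    # Hypothetical effects of each phase on code size.
    effects = {
        "inline": lambda s: s * 1.3,            # inlining grows code
        "dead_code": lambda s: s - 150,         # removes a fixed chunk
        "strength_reduce": lambda s: s * 0.95,  # slight shrink
    }
    for phase in order:
        size = effects[phase](size)
    return size

def search(phases, size, limit):
    best = None
    for order in permutations(phases):
        s = apply_phases(order, size)
        if s <= limit and (best is None or s < best[1]):
            best = (order, s)
    return best

result = search(["inline", "dead_code", "strength_reduce"], 1000, 1060)
```

Because the phase effects do not commute, the order matters; a real
iterative compiler would measure speed, size, or power on the actual
target instead of using a model.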
21. Automatic Validation of Code-Improving Transformations
Investigators:
David Whalley, Robert van Engelen, Xin Yuan
Description:
We are developing techniques to automatically determine if a
code-improving transformation to a program is semantically correct.
We do this by:
a. detecting the changes associated with a transformation and the
region of code affected
b. calculating the effects of the region before and after the
transformation
c. comparing the calculated effects to see if they are identical
This technique will be valuable for checking the validity of
hand-specified transformations for embedded systems.
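Steps (a)-(c) can be sketched as follows, with concrete evaluation
over sample inputs standing in for the calculated effects of a
region; the region and the strength-reduction rewrite are
illustrative:

```python
# The affected region is evaluated before and after a transformation,
# and the computed effects (final values of the registers it writes)
# are compared for equality.
def region_before(i):
    return {"r1": i * 4, "r2": i * 4 + 100}

def region_after(i):
    t = i << 2                # i*4 rewritten as a shift
    return {"r1": t, "r2": t + 100}

def effects_match(f, g, inputs):
    return all(f(i) == g(i) for i in inputs)

valid = effects_match(region_before, region_after, range(-8, 9))
```

Testing on samples is a simplification: the actual technique compares
symbolically calculated effects, so equivalence holds for all inputs,
not just the sampled ones.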
22. Security Protocol Verification
Investigators:
Alec Yasinsac
Description:
Security protocol verification is concerned with ensuring that,
where cryptography is used to solve security problems, it is applied
effectively. Security protocols are notoriously difficult to analyze,
and flaws in them can completely compromise the goals of those that
employ them. Our efforts in this area are directed toward
establishing a workbench for security protocol analysts. The
workbench is founded on the formal method analysis technique that we
developed for this purpose.
More information can be found at:
http://websrv.cs.fsu.edu/~yasinsac/framewk2.pdf
23. Intrusion Detection
Investigators:
Alec Yasinsac, Ernest McDuffie
Description:
We are also concerned with intrusion detection in the security
protocol environment. Our approach is to analyze executing security
protocols to detect and respond to attacks and to learn more about
vulnerabilities in the protocols. We are studying ways to apply
artificial intelligence to reduce the search space when analyzing
user profiles, and to employ intelligent, mobile agents to gather
activity information in encrypted environments.
More information can be found at:
http://websrv.cs.fsu.edu/~yasinsac/IDSP.pdf
and
http://websrv.cs.fsu.edu/~yasinsac/NSPW.pdf
24. Computer and Network Forensics
Investigators:
Alec Yasinsac
Description:
The overwhelming majority of security research is devoted to the
prevention of malicious behavior. We investigate methods to detect
ongoing attacks, minimize the damage, and respond in positive ways
during and after the attack. We begin by addressing the anatomy of
attacks and propose policies that enterprise owners can employ to
enhance their ability to respond to ongoing and completed attacks.
More information can be found at:
http://websrv.cs.fsu.edu/research/reports/TR-000902.ps
25. Cryptography
Investigators:
Alec Yasinsac, Lois Hawkes, and Yvo Desmedt
Description:
This area of investigation is in provable security and cryptographic
techniques. We offer a novel method of authentication and show that
it achieves provable entity authentication in fewer steps than has
been done before.
More information can be found at:
http://websrv.cs.fsu.edu/research/reports/TR-001001.pdf
26. High Performance Message Passing Library for Clusters of
Workstations
Investigators:
Xin Yuan
Description:
As microprocessors become more and more powerful, clusters of
workstations have become one of the most common high-performance
computing environments. One of the key building blocks for such
systems is the message passing library. Standard message passing
libraries, including MPI and PVM, have been implemented for such
systems. Current implementations, such as MPICH and LAM/MPI, focus on
providing the functionality and addressing the portability issues.
These implementations are built over point-to-point communication
primitives supported by the TCP/IP protocol suite, which was designed
for the Internet and may not be suitable for high performance
communication. This research aims at developing a high performance,
customized MPI library for clusters of workstations by exploiting the
multicast capability of LANs.
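The advantage of exploiting LAN multicast can be seen in the number
of sequential communication rounds a broadcast needs under three
schemes. The counts below are standard cost-model round counts, not
measurements on a real cluster:

```python
import math

# Rounds to broadcast one message from a root to p processes.
def rounds_unicast(p):
    return p - 1                    # root sends p-1 point-to-point msgs

def rounds_binomial_tree(p):
    return math.ceil(math.log2(p))  # informed set doubles each round

def rounds_multicast(p):
    return 1                        # one hardware multicast reaches all

p = 32
counts = (rounds_unicast(p), rounds_binomial_tree(p),
          rounds_multicast(p))
```

Standard TCP/IP-based MPI implementations pay the tree or unicast
cost; a multicast-aware library can collapse collective operations
like broadcast into a constant number of rounds.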
27. Compiled Communication for Clusters of Workstations
Investigators:
Xin Yuan
Description:
Traditional communication optimizations focus either on improving
the communication performance of communication routines by taking into
consideration special features in the underlying network architecture
or on reducing the communication requirement in a program by developing
communication optimization algorithms in the compiler. Compiled
communication attempts to exploit more communication optimization
opportunities by considering simultaneously both the underlying network
architecture and the communication patterns in an application program.
This research will study the compiler and run-time support for compiled
communication over clusters of workstations.