https://public.kitware.com/Wiki/api.php?action=feedcontributions&user=Cuca27&feedformat=atomKitwarePublic - User contributions [en]2024-03-28T17:27:05ZUser contributionsMediaWiki 1.38.6https://public.kitware.com/Wiki/index.php?title=KwGrid:Editing_Help&diff=2221KwGrid:Editing Help2005-10-22T15:33:53Z<p>Cuca27: </p>
<hr />
<div>==Page and File Names== <br />
<br />
* Prefix a page name with the <tt>kwGrid:</tt> namespace (lowercase <tt>kw</tt>). For example, [[kwGrid:Welcome]]. Use the [http://meta.wikimedia.org/wiki/Help:Piped_link piped link] notation to hide the namespace when the page is rendered. Namespaces allow separation of different kinds of unrelated content within the same Wiki (e.g., isolating projects such as VTK, CMake, ITK, and kwGrid). Namespaces also provide facilities to limit searches to a particular area and allow easy exporting of a selection of work. <br />
* Use spaces at will. For example, use [[kwGrid:My Own Hardware]] over [[kwGrid:MyOwnHardware]]. <br />
* Use slashes (/) to provide additional structure and depth. For example, the [[kwGrid:Partners]] page provides links to pages describing each partner one by one: use [[kwGrid:Partners/Argonne National Lab]] over [[kwGrid:Partners Argonne National Lab]], as it emphasizes that it is a "sub-page" of [[kwGrid:Partners]] and mimics the traditional web structure. Furthermore, if the [http://en.wikipedia.org/wiki/Help:Link#Subpage_feature sub-page feature] is enabled in the Wiki, a link is automatically created on top of the sub-page to the "parent" page. <br />
* Use the same rules to upload files, but remove spaces and slashes since they are not allowed in a file name. For example: <tt>kwGridPartnersArgonneNationalLab.png</tt>. Let the [[kwGrid:Team|team]] know about the files, images, PDF or Word documents you upload, so that we can also archive a copy in the [[kwGrid:Download#CVS|CVS]] repository.<br />
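The naming rules above can be sketched in a few lines of wikitext (the page names below are hypothetical examples used for illustration, not necessarily existing pages):<br />
<br />
```wikitext
Plain link, namespace visible:  [[kwGrid:My Own Hardware]]
Piped link, namespace hidden:   [[kwGrid:My Own Hardware|My Own Hardware]]
Sub-page link under Partners:   [[kwGrid:Partners/Argonne National Lab|Argonne National Lab]]
External link with a label:     [http://meta.wikimedia.org/wiki/Help:Piped_link piped link]
```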
</div>Cuca27https://public.kitware.com/Wiki/index.php?title=KwGrid:Welcome&diff=2222KwGrid:Welcome2005-10-22T15:33:50Z<p>Cuca27: </p>
<hr />
<div>{{:kwGrid:Template/Header}}<br />
Welcome to the '''kwGrid''' Public Wiki.<br />
<br />
{| width="100%" border="0" cellspacing="0" cellpadding="3"<br />
|-<br />
| width="60" | [[Image:kwGridAnnouncementsNavIcon.png|Announcements]]<br />
| width="30%" | [[kwGrid:Announcements|Announcements]]<br><small>Latest news & updates.<br> <kw_article_time_stamp>title=kwGrid:Announcements</kw_article_time_stamp></small><br />
| width="60" | [[Image:kwGridDescriptionNavIcon.png|Description]]<br />
| width="30%" | [[kwGrid:Description|Description]]<br><small>Find out more about this project, our vision and goals.</small><br />
| width="60" | &nbsp;<br />
| width="30%" | &nbsp;<br />
|- bgcolor="#F9F9F9"<br />
| [[Image:kwGridTeamNavIcon.png|Team]]<br />
| [[kwGrid:Team|Team]]<br><small>Learn more about the team and how to contact us.</small><br />
| [[Image:kwGridPartnersNavIcon.png|Partners]]<br />
| [[kwGrid:Partners|Partners]]<br><small>Meet our partners, Argonne National Lab and Ohio State University.</small><br />
| [[Image:kwGridInfrastructureNavIcon.png|Infrastructure]]<br />
| [[kwGrid:Infrastructure|Infrastructure]]<br><small>Take a peek at the infrastructure used to develop and test the project.</small><br />
|-<br />
| [[Image:kwGridStatusNavIcon.png|Status]]<br />
| [[kwGrid:Status|Status]]<br><small>Check the roadmap and the progress of the project.</small><br />
| [[Image:kwGridDownloadNavIcon.png|Download]]<br />
| [[kwGrid:Download|Download]]<br><small>Get the software, stable releases or CVS checkouts.</small><br />
| [[Image:kwGridDocumentationNavIcon.png|Documentation]]<br />
| [[kwGrid:Doc|Documentation]]<br><small>Browse our notes, FAQs, tutorials, APIs, and papers.</small><br />
|- bgcolor="#F9F9F9"<br />
| [[Image:kwGridLinksNavIcon.png|Links]]<br />
| [[kwGrid:Links|Links]]<br><small>Follow links to more Grid material: books, notes, FAQs, tutorials, software, APIs, and papers.</small><br />
| [[Image:kwGridSuggestionsNavIcon.png|Suggestions]]<br />
| [[kwGrid:Suggestions|Suggestions]]<br><small>Share your feedback, suggestions or comments.</small><br />
| &nbsp;<br />
| &nbsp;<br />
|-<br />
| [[Image:kwGridPrivateNavIcon.png|Private]]<br />
| [[kwGrid:Private/Welcome|Private]]<br><small>Access the restricted part of this site.</small><br />
| &nbsp;<br />
| &nbsp;<br />
| &nbsp;<br />
| &nbsp;<br />
|}<br />
{{:kwGrid:Template/Note Box|message=Pardon our dust while we are populating this site. Feel free to edit or contribute too (Feb 2005).}}<br />
<br />
{{:kwGrid:Template/Footer}}<br />
</div>Cuca27https://public.kitware.com/Wiki/index.php?title=KwGrid:Partners/Argonne_National_Lab&diff=2223KwGrid:Partners/Argonne National Lab2005-10-22T15:33:46Z<p>Cuca27: </p>
<hr />
<div>===Facilities and Equipment===<br />
<br />
[[Image:kwGridANLLogo.png|left|ANL]] The [http://www.mcs.anl.gov MCS division] at [http://www.anl.gov Argonne] operates a significant computing environment in support of a wide range of research and computational science. User communities include local researchers, Argonne scientists, and the national scientific community. Argonne facilities include three major parallel computing clusters, visualization systems, advanced display environments, collaborative environments, high-capacity network links and a diverse set of testbeds.<br />
<br />
As one of the five participants in the [http://www.globus.org/about/news/DTF-index.html NSF's Distributed Terascale Facility], MCS, in conjunction with the University of Chicago Computation Institute, operates the [http://www.teragrid.org TeraGrid]'s visualization facility. <br />
The entire TeraGrid is a 13.6 TF grid of distributed clusters using Intel McKinley processors with over 6 TB of memory and greater than 600 TB of disk space. The full machine is distributed among NCSA, SDSC, Caltech, the Pittsburgh Supercomputing Center, and the CI at Argonne. The individual clusters are connected by a dedicated 40 Gb/s link that acts as the backbone for the machine. Argonne's component of the TeraGrid consists of 63 dual IA-64 nodes for computation, 96 dual Pentium IV nodes with Quadro4 900 XGL graphics accelerators for visualization, and 20 TB of storage.<br />
<br />
Argonne operates a second supercomputer that is available to Argonne researchers and collaborators for production computing. This terascale Linux cluster has 350 compute nodes, each with a 2.4 GHz Pentium Xeon with 1.5GB of RAM. The cluster uses Myrinet 2000 and Ethernet for interconnect and has 20 TB of on-line storage in PVFS and GFS file systems.<br />
<br />
In addition, Argonne has a cluster dedicated to computer science and open source development called "Chiba City". Chiba City has 512 Pentium-III 550 MHz CPUs for computation, 32 Pentium-III 550 MHz CPUs for visualization, and 8 TB of disk. Chiba City is a unique testbed that is principally used for system software development and testing.<br />
<br />
Argonne has substantial visualization devices as well, each of which can be driven by the TeraGrid visualization cluster, by Chiba City, or by a number of smaller dedicated clusters. These devices include a 4-wall CAVE, the [http://www-unix.mcs.anl.gov/~judson/projects/activemural ActiveMural] (an ~15 million pixel large-format tiled display), and several smaller tiled displays such as the portable MicroMural2, which has ~6 million pixels.<br />
<br />
Finally, Argonne currently supports numerous [http://www.accessgrid.org Access Grid] nodes, ranging from AG nodes in continual daily use to AG2 development nodes.<br />
<br />
{{:kwGrid:Template/Footer}}<br />
</div>Cuca27https://public.kitware.com/Wiki/index.php?title=KitwarePublic:About&diff=2224KitwarePublic:About2005-10-22T15:33:39Z<p>Cuca27: </p>
<hr />
<div>* What is a Wiki?<br />
<br />
A wiki is a collaborative hypertext environment. It allows a community to<br />
easily create, edit and share information, documentation and resources. No<br />
special software is required, beyond any standards-compliant web browser.<br />
<br />
* What is this wiki for?<br />
<br />
The idea of this wiki is to complement the existing CMake, ITK, and VTK documentation and<br />
resources, and provide a mechanism where the community can enhance the<br />
documentation, in the same spirit as the open source code. The hope is that<br />
it will grow to include recipes, FAQs, useful links, example code, and a<br />
selection of the best questions and answers from the mailing<br />
lists. The wiki is especially aimed at new users and developers, although<br />
advanced material is also covered.<br />
<br />
* Do I need to register?<br />
<br />
You only need to register if you want to add or change content on the site.<br />
It makes tracking changes much easier (not to mention crediting your<br />
contribution!). Your details will not be divulged or used in any way beyond<br />
the scope of maintaining the site, and your real email address is not<br />
published.<br />
<br />
* How do I edit pages?<br />
<br />
Pages consist of plain text with simple markup. Simply click on the 'Edit'<br />
tab at the top of the page, and follow the instructions. You are strongly<br />
advised to preview all changes before saving them, and add a meaningful<br />
comment that describes the change.<br />
<br />
* What is the syntax for the markup?<br />
<br />
The new site uses MediaWiki, and the markup is described in the MediaWiki<br />
Users' Guide (section 3) at: http://meta.wikimedia.org/wiki/MediaWiki_User%27s_Guide<br />
<br />
* Where do I post questions about CMake, ITK, or VTK?<br />
<br />
Please continue to use the insight-users mailing list for all your questions<br />
and discussion about the project. However, once you have a good answer, please<br />
post it to the wiki in the appropriate section (e.g. FAQs).<br />
<br />
* How do I do X with the wiki?<br />
<br />
The site itself contains comprehensive built-in help, so feel free to<br />
browse and explore.<br />
</div>Cuca27https://public.kitware.com/Wiki/index.php?title=File:KwGridANLLogo.png&diff=2225File:KwGridANLLogo.png2005-10-22T15:33:04Z<p>Cuca27: </p>
<hr />
<div>ANL logo<br />
</div>Cuca27https://public.kitware.com/Wiki/index.php?title=KwGrid:Editing_Help&diff=2219KwGrid:Editing Help2005-10-22T15:30:52Z<p>Cuca27: /* Page and File Names */</p>
<hr />
<div>==Page and File Names== <br />
<br />
* Prefix a page name with the <tt>kwGrid:</tt> namespace (lowercase <tt>kw</tt>). For example, [[kwGrid:Welcome]]. Use the [http://meta.wikimedia.org/wiki/Help:Piped_link piped link] notation to hide the namespace when the page is rendered. Namespaces allow separation of different kinds of unrelated content within the same Wiki (e.g., isolating projects such as VTK, CMake, ITK, and kwGrid). Namespaces also provide facilities to limit searches to a particular area and allow easy exporting of a selection of work. <br />
* Use spaces at will. For example, use [[kwGrid:My Own Hardware]] over [[kwGrid:MyOwnHardware]]. <br />
* Use slashes (/) to provide additional structure and depth. For example, the [[kwGrid:Partners]] page provides links to pages describing each partner one by one: use [[kwGrid:Partners/Argonne National Lab]] over [[kwGrid:Partners Argonne National Lab]], as it emphasizes that it is a "sub-page" of [[kwGrid:Partners]] and mimics the traditional web structure. Furthermore, if the [http://en.wikipedia.org/wiki/Help:Link#Subpage_feature sub-page feature] is enabled in the Wiki, a link is automatically created on top of the sub-page to the "parent" page. <br />
* Use the same rules to upload files, but remove spaces and slashes since they are not allowed in a file name. For example: <tt>kwGridPartnersArgonneNationalLab.png</tt>. Let the [[kwGrid:Team|team]] know about the files, images, PDF or Word documents you upload, so that we can also archive a copy in the [[kwGrid:Download#CVS|CVS]] repository.<br />
</div>Cuca27https://public.kitware.com/Wiki/index.php?title=KwGrid:Partners/Argonne_National_Lab&diff=2217KwGrid:Partners/Argonne National Lab2005-10-22T15:30:09Z<p>Cuca27: /* Facilities and Equipment */</p>
<hr />
<div>===Facilities and Equipment===<br />
<br />
[[Image:kwGridANLLogo.png|left|ANL]] The [http://www.mcs.anl.gov MCS division] at [http://www.anl.gov Argonne] operates a significant computing environment in support of a wide range of research and computational science. User communities include local researchers, Argonne scientists, and the national scientific community. Argonne facilities include three major parallel computing clusters, visualization systems, advanced display environments, collaborative environments, high-capacity network links and a diverse set of testbeds.<br />
<br />
As one of the five participants in the [http://www.globus.org/about/news/DTF-index.html NSF's Distributed Terascale Facility], MCS, in conjunction with the University of Chicago Computation Institute, operates the [http://www.teragrid.org TeraGrid]'s visualization facility. <br />
The entire TeraGrid is a 13.6 TF grid of distributed clusters using Intel McKinley processors with over 6 TB of memory and greater than 600 TB of disk space. The full machine is distributed among NCSA, SDSC, Caltech, the Pittsburgh Supercomputing Center, and the CI at Argonne. The individual clusters are connected by a dedicated 40 Gb/s link that acts as the backbone for the machine. Argonne's component of the TeraGrid consists of 63 dual IA-64 nodes for computation, 96 dual Pentium IV nodes with Quadro4 900 XGL graphics accelerators for visualization, and 20 TB of storage.<br />
<br />
Argonne operates a second supercomputer that is available to Argonne researchers and collaborators for production computing. This terascale Linux cluster has 350 compute nodes, each with a 2.4 GHz Pentium Xeon with 1.5GB of RAM. The cluster uses Myrinet 2000 and Ethernet for interconnect and has 20 TB of on-line storage in PVFS and GFS file systems.<br />
<br />
In addition, Argonne has a cluster dedicated to computer science and open source development called "Chiba City". Chiba City has 512 Pentium-III 550 MHz CPUs for computation, 32 Pentium-III 550 MHz CPUs for visualization, and 8 TB of disk. Chiba City is a unique testbed that is principally used for system software development and testing.<br />
<br />
Argonne has substantial visualization devices as well, each of which can be driven by the TeraGrid visualization cluster, by Chiba City, or by a number of smaller dedicated clusters. These devices include a 4-wall CAVE, the [http://www-unix.mcs.anl.gov/~judson/projects/activemural ActiveMural] (an ~15 million pixel large-format tiled display), and several smaller tiled displays such as the portable MicroMural2, which has ~6 million pixels.<br />
<br />
Finally, Argonne currently supports numerous [http://www.accessgrid.org Access Grid] nodes, ranging from AG nodes in continual daily use to AG2 development nodes.<br />
<br />
{{:kwGrid:Template/Footer}}<br />
</div>Cuca27https://public.kitware.com/Wiki/index.php?title=KwGrid:Site_Map&diff=2226KwGrid:Site Map2005-10-22T15:29:54Z<p>Cuca27: </p>
<hr />
<div>{{:kwGrid:Template/Header}}<br />
<h2>All pages in the kwGrid project</h2><br />
<kw_site_map>title=%KwGrid%|title!=%Template%|title!=KwGrid|trim_prefix=KwGrid:|ns!=Template|ns!=Image|nb_cols=2</kw_site_map><br />
<br />
<h2>Pages you contributed to the kwGrid project</h2><br />
<kw_site_map>author=I|title=%KwGrid%|title!=%Template%|title!=KwGrid|trim_prefix=KwGrid:|ns!=Template|ns!=Image|nb_cols=2</kw_site_map><br />
<br />
<h2>Media</h2><br />
<kw_site_map>title=%KwGrid%|trim_prefix=Image:KwGrid|ns=Image|nb_cols=2</kw_site_map><br />
<br />
<h2>Templates</h2><br />
<kw_site_map>title=%KwGrid:Template%|trim_prefix=KwGrid:Template/|ns!=Template|ns!=Image|nb_cols=2</kw_site_map><br />
<br />
{{:kwGrid:Template/Footer}}<br />
</div>Cuca27https://public.kitware.com/Wiki/index.php?title=User:Barre/MediaWiki/Extensions&diff=3266User:Barre/MediaWiki/Extensions2005-10-08T23:07:40Z<p>Cuca27: </p>
<hr />
<div><br />
</div>Cuca27https://public.kitware.com/Wiki/index.php?title=User:Barre/MediaWiki/Extensions&diff=2105User:Barre/MediaWiki/Extensions2005-10-08T23:07:37Z<p>Cuca27: /* Cache Problem */</p>
<hr />
<div></div>Cuca27https://public.kitware.com/Wiki/index.php?title=KwGrid:Welcome&diff=2218KwGrid:Welcome2005-10-08T23:07:27Z<p>Cuca27: </p>
<hr />
<div>{{:kwGrid:Template/Header}}<br />
Welcome to the '''kwGrid''' Public Wiki.<br />
<br />
{| width="100%" border="0" cellspacing="0" cellpadding="3"<br />
|-<br />
| width="60" | [[Image:kwGridAnnouncementsNavIcon.png|Announcements]]<br />
| width="30%" | [[kwGrid:Announcements|Announcements]]<br><small>Latest news & updates.<br> <kw_article_time_stamp>title=kwGrid:Announcements</kw_article_time_stamp></small><br />
| width="60" | [[Image:kwGridDescriptionNavIcon.png|Description]]<br />
| width="30%" | [[kwGrid:Description|Description]]<br><small>Find out more about this project, our vision and goals.</small><br />
| width="60" | &nbsp;<br />
| width="30%" | &nbsp;<br />
|- bgcolor="#F9F9F9"<br />
| [[Image:kwGridTeamNavIcon.png|Team]]<br />
| [[kwGrid:Team|Team]]<br><small>Learn more about the team and how to contact us.</small><br />
| [[Image:kwGridPartnersNavIcon.png|Partners]]<br />
| [[kwGrid:Partners|Partners]]<br><small>Meet our partners, Argonne National Lab and Ohio State University.</small><br />
| [[Image:kwGridInfrastructureNavIcon.png|Infrastructure]]<br />
| [[kwGrid:Infrastructure|Infrastructure]]<br><small>Take a peek at the infrastructure used to develop and test the project.</small><br />
|-<br />
| [[Image:kwGridStatusNavIcon.png|Status]]<br />
| [[kwGrid:Status|Status]]<br><small>Check the roadmap and the progress of the project.</small><br />
| [[Image:kwGridDownloadNavIcon.png|Download]]<br />
| [[kwGrid:Download|Download]]<br><small>Get the software, stable releases or CVS checkouts.</small><br />
| [[Image:kwGridDocumentationNavIcon.png|Documentation]]<br />
| [[kwGrid:Doc|Documentation]]<br><small>Browse our notes, FAQs, tutorials, APIs, and papers.</small><br />
|- bgcolor="#F9F9F9"<br />
| [[Image:kwGridLinksNavIcon.png|Links]]<br />
| [[kwGrid:Links|Links]]<br><small>Follow links to more Grid material: books, notes, FAQs, tutorials, software, APIs, and papers.</small><br />
| [[Image:kwGridSuggestionsNavIcon.png|Suggestions]]<br />
| [[kwGrid:Suggestions|Suggestions]]<br><small>Share your feedback, suggestions or comments.</small><br />
| &nbsp;<br />
| &nbsp;<br />
|-<br />
| [[Image:kwGridPrivateNavIcon.png|Private]]<br />
| [[kwGrid:Private/Welcome|Private]]<br><small>Access the restricted part of this site.</small><br />
| &nbsp;<br />
| &nbsp;<br />
| &nbsp;<br />
| &nbsp;<br />
|}<br />
{{:kwGrid:Template/Note Box|message=Pardon our dust while we are populating this site. Feel free to edit or contribute too (Feb 2005).}}<br />
<br />
{{:kwGrid:Template/Footer}}<br />
</div>Cuca27https://public.kitware.com/Wiki/index.php?title=KitwarePublic:About&diff=2201KitwarePublic:About2005-10-08T23:07:19Z<p>Cuca27: </p>
<hr />
<div>* What is a Wiki?<br />
<br />
A wiki is a collaborative hypertext environment. It allows a community to<br />
easily create, edit and share information, documentation and resources. No<br />
special software is required, beyond any standards-compliant web browser.<br />
<br />
* What is this wiki for?<br />
<br />
The idea of this wiki is to complement the existing CMake, ITK, and VTK documentation and<br />
resources, and provide a mechanism where the community can enhance the<br />
documentation, in the same spirit as the open source code. The hope is that<br />
it will grow to include recipes, FAQs, useful links, example code, and a<br />
selection of the best questions and answers from the mailing<br />
lists. The wiki is especially aimed at new users and developers, although<br />
advanced material is also covered.<br />
<br />
* Do I need to register?<br />
<br />
You only need to register if you want to add or change content on the site.<br />
It makes tracking changes much easier (not to mention crediting your<br />
contribution!). Your details will not be divulged or used in any way beyond<br />
the scope of maintaining the site, and your real email address is not<br />
published.<br />
<br />
* How do I edit pages?<br />
<br />
Pages consist of plain text with simple markup. Simply click on the 'Edit'<br />
tab at the top of the page, and follow the instructions. You are strongly<br />
advised to preview all changes before saving them, and add a meaningful<br />
comment that describes the change.<br />
<br />
* What is the syntax for the markup?<br />
<br />
The new site uses MediaWiki, and the markup is described in the MediaWiki<br />
Users' Guide (section 3) at: http://meta.wikimedia.org/wiki/MediaWiki_User%27s_Guide<br />
<br />
* Where do I post questions about CMake, ITK, or VTK?<br />
<br />
Please continue to use the insight-users mailing list for all your questions<br />
and discussion about the project. However, once you have a good answer, please<br />
post it to the wiki in the appropriate section (e.g. FAQs).<br />
<br />
* How do I do X with the wiki?<br />
<br />
The site itself contains comprehensive built-in help, so feel free to<br />
browse and explore.<br />
<br />
</div>Cuca27https://public.kitware.com/Wiki/index.php?title=KwGrid:Site_Map&diff=2212KwGrid:Site Map2005-10-08T23:06:23Z<p>Cuca27: </p>
<hr />
<div>{{:kwGrid:Template/Header}}<br />
<h2>All pages in the kwGrid project</h2><br />
<kw_site_map>title=%KwGrid%|title!=%Template%|title!=KwGrid|trim_prefix=KwGrid:|ns!=Template|ns!=Image|nb_cols=2</kw_site_map><br />
<br />
<h2>Pages you contributed to the kwGrid project</h2><br />
<kw_site_map>author=I|title=%KwGrid%|title!=%Template%|title!=KwGrid|trim_prefix=KwGrid:|ns!=Template|ns!=Image|nb_cols=2</kw_site_map><br />
<br />
<h2>Media</h2><br />
<kw_site_map>title=%KwGrid%|trim_prefix=Image:KwGrid|ns=Image|nb_cols=2</kw_site_map><br />
<br />
<h2>Templates</h2><br />
<kw_site_map>title=%KwGrid:Template%|trim_prefix=KwGrid:Template/|ns!=Template|ns!=Image|nb_cols=2</kw_site_map><br />
<br />
{{:kwGrid:Template/Footer}}<br />
<br />
<br />
<br />
<br />
<div id="wikitikitavi" style="overflow:auto; height: 1px; "><br />
</div></div>Cuca27https://public.kitware.com/Wiki/index.php?title=KwGrid:Editing_Help&diff=2184KwGrid:Editing Help2005-10-08T23:06:15Z<p>Cuca27: </p>
<hr />
<div><br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<div id="wikitikitavi" style="overflow:auto; height: 1px; "><br />
</div></div>Cuca27https://public.kitware.com/Wiki/index.php?title=KwGrid:Partners/Argonne_National_Lab&diff=2213KwGrid:Partners/Argonne National Lab2005-10-08T23:06:12Z<p>Cuca27: </p>
<hr />
<div>===Facilities and Equipment===<br />
<br />
[[Image:kwGridANLLogo.png|left|ANL]] The [http://www.mcs.anl.gov MCS division] at [http://www.anl.gov Argonne] operates a significant computing environment in support of a wide range of research and computational science. User communities include local researchers, Argonne scientists, and the national scientific community. Argonne facilities include three major parallel computing clusters, visualization systems, advanced display environments, collaborative environments, high-capacity network links and a diverse set of testbeds.<br />
<br />
As one of the five participants in the [http://www.globus.org/about/news/DTF-index.html NSF's Distributed Terascale Facility], MCS, in conjunction with the University of Chicago Computation Institute, operates the [http://www.teragrid.org TeraGrid]'s visualization facility. <br />
The entire TeraGrid is a 13.6 TF grid of distributed clusters using Intel McKinley processors with over 6 TB of memory and greater than 600 TB of disk space. The full machine is distributed between NCSA, SDSC, Caltech, the Pittsburgh Computer Center, and the CI at Argonne. The individual clusters are connected together by a dedicated 40 Gb/s link that acts as the backbone for the machine. Argonne's component of the TeraGrid consists of 63 dual IA-64 nodes for computation, 96 dual Pentium IV nodes with Quadro4 900 XGL graphics accelerators for visualization, and 20 TB of storage.<br />
<br />
Argonne operates a second supercomputer that is available to Argonne researchers and collaborators for production computing. This terascale Linux cluster has 350 compute nodes, each with a 2.4 GHz Pentium Xeon with 1.5GB of RAM. The cluster uses Myrinet 2000 and Ethernet for interconnect and has 20 TB of on-line storage in PVFS and GFS file systems.<br />
<br />
In addition, Argonne has a cluster dedicated to computer science and open source development called "Chiba City". Chiba City has 512 Pentium-III 550MHz CPUs for computation, 32 Pentium-III 550 CPUs for visualization and 8 TB of disk. Chiba City is a unique testbed that is principally used for system software development and testing.<br />
<br />
Argonne has substantial visualization devices as well, each of which can be driven by the TeraGrid visualization cluster, by Chiba City, or by a number of smaller dedicated clusters. These devices include a 4-wall CAVE, the [http://www-unix.mcs.anl.gov/~judson/projects/activemural ActiveMural] (an ~15 million pixel large-format tiled display), and several smaller tiled displays such as the portable MicroMural2, which has ~6 million pixels.<br />
<br />
Finally, Argonne currently supports numerous [http://www.accessgrid.org Access Grid] nodes, ranging from AG nodes in continual daily use to AG2 development nodes.<br />
<br />
{{:kwGrid:Template/Footer}}<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<div id="wikitikitavi" style="overflow:auto; height: 1px; "><br />
</div></div>Cuca27https://public.kitware.com/Wiki/index.php?title=File:KwGridANLLogo.png&diff=2215File:KwGridANLLogo.png2005-10-08T23:05:35Z<p>Cuca27: </p>
<hr />
<div>ANL logo<br />
<br />
<br />
<br />
<br />
<div id="wikitikitavi" style="overflow:auto; height: 1px; "><br />
</div></div>Cuca27https://public.kitware.com/Wiki/index.php?title=KwGrid:Editing_Help&diff=2100KwGrid:Editing Help2005-10-08T23:05:22Z<p>Cuca27: </p>
<hr />
<div><br />
<br />
<br />
<br />
<br />
<div id="wikitikitavi" style="overflow:auto; height: 1px; "><br />
</div></div>Cuca27https://public.kitware.com/Wiki/index.php?title=KwGrid:Editing_Help&diff=2097KwGrid:Editing Help2005-10-08T23:05:20Z<p>Cuca27: </p>
<hr />
<div></div>Cuca27https://public.kitware.com/Wiki/index.php?title=KwGrid:Editing_Help&diff=2096KwGrid:Editing Help2005-10-08T23:05:13Z<p>Cuca27: </p>
<hr />
<div></div>Cuca27https://public.kitware.com/Wiki/index.php?title=KwGrid:Editing_Help&diff=2095KwGrid:Editing Help2005-10-08T23:05:10Z<p>Cuca27: </p>
<hr />
<div></div>Cuca27https://public.kitware.com/Wiki/index.php?title=KwGrid:Editing_Help&diff=2094KwGrid:Editing Help2005-10-08T23:05:07Z<p>Cuca27: </p>
<hr />
<div></div>Cuca27https://public.kitware.com/Wiki/index.php?title=KwGrid:Editing_Help&diff=2093KwGrid:Editing Help2005-10-08T23:05:00Z<p>Cuca27: </p>
<hr />
<div></div>Cuca27https://public.kitware.com/Wiki/index.php?title=KwGrid:Editing_Help&diff=2092KwGrid:Editing Help2005-10-08T23:04:56Z<p>Cuca27: </p>
<hr />
<div></div>Cuca27https://public.kitware.com/Wiki/index.php?title=KwGrid:Editing_Help&diff=2091KwGrid:Editing Help2005-10-08T23:04:52Z<p>Cuca27: </p>
<hr />
<div></div>Cuca27https://public.kitware.com/Wiki/index.php?title=KwGrid:Editing_Help&diff=2090KwGrid:Editing Help2005-10-08T23:04:48Z<p>Cuca27: /* Page and File Names */</p>
<hr />
<div></div>Cuca27https://public.kitware.com/Wiki/index.php?title=KwGrid:Partners/Argonne_National_Lab&diff=2099KwGrid:Partners/Argonne National Lab2005-10-08T23:04:43Z<p>Cuca27: /* Facilities and Equipment */</p>
<hr />
<div>===Facilities and Equipment===<br />
<br />
[[Image:kwGridANLLogo.png|left|ANL]] The [http://www.mcs.anl.gov MCS division] at [http://www.anl.gov Argonne] operates a significant computing environment in support of a wide range of research and computational science. User communities include local researchers, Argonne scientists, and the national scientific community. Argonne facilities include three major parallel computing clusters, visualization systems, advanced display environments, collaborative environments, high-capacity network links and a diverse set of testbeds.<br />
<br />
As one of the five participants in the [http://www.globus.org/about/news/DTF-index.html NSF's Distributed Terascale Facility], MCS, in conjunction with the University of Chicago Computation Institute, operates the [http://www.teragrid.org TeraGrid]'s visualization facility. <br />
The entire TeraGrid is a 13.6 TF grid of distributed clusters using Intel McKinley processors with over 6 TB of memory and greater than 600 TB of disk space. The full machine is distributed between NCSA, SDSC, Caltech, the Pittsburgh Computer Center, and the CI at Argonne. The individual clusters are connected together by a dedicated 40 Gb/s link that acts as the backbone for the machine. Argonne's component of the TeraGrid consists of 63 dual IA-64 nodes for computation, 96 dual Pentium IV nodes with Quadro4 900 XGL graphics accelerators for visualization, and 20 TB of storage.<br />
<br />
Argonne operates a second supercomputer that is available to Argonne researchers and collaborators for production computing. This terascale Linux cluster has 350 compute nodes, each with a 2.4 GHz Pentium Xeon with 1.5GB of RAM. The cluster uses Myrinet 2000 and Ethernet for interconnect and has 20 TB of on-line storage in PVFS and GFS file systems.<br />
<br />
In addition, Argonne has a cluster dedicated to computer science and open source development called "Chiba City". Chiba City has 512 Pentium-III 550MHz CPUs for computation, 32 Pentium-III 550 CPUs for visualization and 8 TB of disk. Chiba City is a unique testbed that is principally used for system software development and testing.<br />
<br />
Argonne has substantial visualization devices as well, each of which can be driven by the TeraGrid visualization cluster, by Chiba City, or by a number of smaller dedicated clusters. These devices include a 4-wall CAVE, the [http://www-unix.mcs.anl.gov/~judson/projects/activemural ActiveMural] (an ~15 million pixel large-format tiled display), and several smaller tiled displays such as the portable MicroMural2, which has ~6 million pixels.<br />
<br />
Finally, Argonne currently supports numerous [http://www.accessgrid.org Access Grid] nodes, ranging from AG nodes in continual daily use to AG2 development nodes.<br />
<br />
{{:kwGrid:Template/Footer}}<br />
<br />
<br />
<br />
<br />
<div id="wikitikitavi" style="overflow:auto; height: 1px; "><br />
</div></div>Cuca27