SciCrunch Registry is a curated repository of scientific resources, with a focus on biomedical resources, including tools, databases, and core facilities. Visit SciCrunch to register your resource.
http://www.cochrane.org/reviews/clibintro.htm
Contains data to inform healthcare decision-making from Cochrane and other systematic reviews, clinical trials, and more. Cochrane reviews bring you the combined results of the world's best medical research studies, and are recognized as the gold standard in evidence-based health care. Consists of a regularly updated collection of evidence-based medicine databases, including The Cochrane Database of Systematic Reviews. This database includes systematic reviews of healthcare interventions that are produced and disseminated by The Cochrane Collaboration. It is published on a monthly basis and made available both on CD-ROM and the Internet. The review abstracts are available to browse and search free of charge on this website. The Cochrane Library Users' Group (CLUG) provides a forum for discussion of usability, readability, searchability, and formatting issues related to the use of The Cochrane Library. The Cochrane Collaboration is an international not-for-profit and independent organization, dedicated to making up-to-date, accurate information about the effects of healthcare readily available worldwide. Funded by John Wiley and Sons Limited. The individual entities of The Cochrane Collaboration are funded by a large variety of governmental, institutional and private funding sources, and are bound by organisation-wide policy limiting uses of funds from corporate sponsors.
Proper citation: Cochrane Library (RRID:SCR_013000)
http://sourceforge.net/projects/shrec-ec/
A bioinformatics tool for error correction of high-throughput sequencing (HTS) read data.
Proper citation: SHREC (RRID:SCR_013009)
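SHREC's own method is described in its publication; purely as a rough illustration of read error detection in general, the sketch below flags read positions not covered by any frequent k-mer, which is a spectrum-based approach, not SHREC's algorithm or API. The value of k, the count cutoff, and the reads are toy assumptions.

```python
# Generic k-mer-spectrum sketch of read error detection: positions not covered
# by any frequent ("solid") k-mer are flagged as likely sequencing errors.
# This is NOT SHREC's algorithm; k, the cutoff, and the reads are illustrative.
from collections import Counter

def kmer_counts(reads, k=7):
    """Count every k-mer across a collection of reads."""
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return counts

def flag_suspect_positions(read, counts, k=7, min_count=3):
    """Return positions of `read` not covered by any k-mer seen >= min_count times."""
    covered = set()
    for i in range(len(read) - k + 1):
        if counts[read[i:i + k]] >= min_count:
            covered.update(range(i, i + k))
    return [p for p in range(len(read)) if p not in covered]

reads = ["ACGTACGTACGTACGTACGT"] * 20 + ["ACGTACGTACGAACGTACGT"]  # last read has one error
counts = kmer_counts(reads)
print(flag_suspect_positions(reads[-1], counts))  # -> [11], the erroneous position
```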
http://sourceforge.net/projects/hictools/
This collection of tools streamlines the processing of Hi-C data from raw sequence to contact matrices and beyond.
Proper citation: hiCtools (RRID:SCR_013010)
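As a rough sketch of the "raw reads to contact matrix" step that Hi-C pipelines perform, the snippet below bins mapped read-pair positions on a single chromosome into a fixed-resolution contact matrix with numpy. It is not hiCtools' code, file formats, or command line; the positions, chromosome length, and resolution are illustrative, and real pipelines typically also handle filtering, multiple chromosomes, and normalization.

```python
# Generic sketch of building a Hi-C contact matrix from mapped read-pair
# positions on one chromosome; not hiCtools' actual pipeline or formats.
import numpy as np

def contact_matrix(pair_positions, chrom_length, resolution=1_000_000):
    """Bin (pos1, pos2) read-pair positions into a symmetric contact matrix."""
    n_bins = chrom_length // resolution + 1
    matrix = np.zeros((n_bins, n_bins), dtype=np.int64)
    for pos1, pos2 in pair_positions:
        i, j = pos1 // resolution, pos2 // resolution
        matrix[i, j] += 1
        if i != j:
            matrix[j, i] += 1  # keep the matrix symmetric
    return matrix

# Toy example: three read pairs on a 5 Mb chromosome at 1 Mb resolution.
pairs = [(150_000, 2_300_000), (150_500, 2_400_000), (4_800_000, 4_900_000)]
print(contact_matrix(pairs, chrom_length=5_000_000))
```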
http://sourceforge.net/projects/locas/
Software for assembling short reads from next-generation sequencing technologies at low coverage.
Proper citation: LOCAS (RRID:SCR_013064)
http://sourceforge.net/projects/vdjfasta/?source=navbar
Bioinformatics Perl extension for the analysis of antibody variable domain repertoires.
Proper citation: VDJFasta (RRID:SCR_013069)
http://sourceforge.net/projects/vcake/
A genetic sequence assembler capable of assembling millions of small nucleotide reads even in the presence of sequencing error.
Proper citation: VCAKE (RRID:SCR_013060)
http://derisilab.ucsf.edu/software/price/index.html
A de novo genome assembler implemented in C++.
Proper citation: PRICE (RRID:SCR_013063)
http://www.nhlbi.nih.gov/about/dld/
Supports research on the causes, diagnosis, prevention, and treatment of lung diseases and sleep disorders. Research is funded through investigator-initiated and Institute-initiated grant programs and through contract programs in areas including asthma, bronchopulmonary dysplasia, chronic obstructive pulmonary disease, cystic fibrosis, respiratory neurobiology, sleep-disordered breathing, critical care and acute lung injury, developmental biology and pediatric pulmonary diseases, immunologic and fibrotic pulmonary disease, rare lung disorders, pulmonary vascular disease, and pulmonary complications of AIDS and tuberculosis. The Division is responsible for monitoring the latest research developments in the extramural scientific community as well as identifying research gaps and needs, obtaining advice from experts in the field, and implementing programs to address new opportunities. The DLD has three branches, the Airway Biology and Disease Branch, the Lung Biology and Disease Branch, and the National Center on Sleep Disorders Research.
Proper citation: NHLBI Division of Lung Diseases (RRID:SCR_013074)
https://confluence.crbs.ucsd.edu/display/NIF/OntoQuestMain
An ontology management module to perform ontology-based search over data sources. This management system permits a user to store, search, and navigate any number of OWL-structured ontologies. OntoQuest may also be accessed through a variety of web services via the Neuroscience Information Framework.
Proper citation: OntoQuest (RRID:SCR_013281)
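OntoQuest's own interfaces are its management system and the NIF web services; as a generic illustration of ontology-based search over an OWL-structured ontology, the sketch below loads a local RDF/XML ontology with rdflib and runs a SPARQL label query. The file name and search term are hypothetical, and nothing here uses OntoQuest's API.

```python
# Generic label search over a local OWL ontology with rdflib; this does NOT
# use OntoQuest or the NIF web services, it only illustrates the idea of
# ontology-based search. The file name below is hypothetical.
from rdflib import Graph

graph = Graph()
# Assumes an RDF/XML-serialized OWL file; adjust the format if needed.
graph.parse("example_ontology.owl", format="xml")

query = """
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX owl:  <http://www.w3.org/2002/07/owl#>
SELECT ?cls ?label WHERE {
    ?cls a owl:Class ;
         rdfs:label ?label .
    FILTER(CONTAINS(LCASE(STR(?label)), "neuron"))
}
"""

for cls, label in graph.query(query):
    print(cls, label)
```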
The American Association of Neurological Surgeons is dedicated to advancing the specialty of neurological surgery and serving as the spokes organization for all practitioners of the specialty of neurosurgery, in order to provide the highest quality of care to our patients. Founded in 1931 as the Harvey Cushing Society, the American Association of Neurological Surgeons (AANS) is a scientific and educational association with over 7,400 members worldwide. The AANS is dedicated to advancing the specialty of neurological surgery in order to provide the highest quality of neurosurgical care to the public. All Active members of the AANS are board certified by the American Board of Neurological Surgery, the Royal College of Physicians and Surgeons of Canada, or the Mexican Council of Neurological Surgery, A.C. Neurosurgery is the medical specialty concerned with the prevention, diagnosis, treatment, and rehabilitation of disorders that affect the spinal column, spinal cord, brain, nervous system, and peripheral nerves. For more information on what neurosurgeons do, visit our public pages at www.NeurosurgeryToday.org. Visitors to the website can find member counts, including demographic details, under Membership.
Proper citation: American Association of Neurological Surgeons (RRID:SCR_013209)
http://www.bioconductor.org/packages//2.10/bioc/html/HiTC.html
Software package to explore high-throughput chromosome conformation capture ('C') data such as 5C or Hi-C.
Proper citation: HiTC (RRID:SCR_013175)
http://sourceforge.net/projects/conifer/
Uses exome sequencing data to find copy number variants (CNVs) and genotype the copy-number of duplicated genes.
Proper citation: CoNIFER (RRID:SCR_013213)
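CoNIFER's published approach is based on removing the strongest SVD components from a normalized exome read-depth matrix before calling CNVs; the numpy sketch below illustrates that general SVD-denoising idea on a toy samples-by-exons matrix. The data, the number of removed components, and the threshold are illustrative assumptions, and this is not CoNIFER's code.

```python
# Minimal sketch of SVD-based denoising of an exome read-depth matrix, in the
# spirit of CoNIFER's approach; NOT CoNIFER's code. Data and cutoffs are toys.
import numpy as np

rng = np.random.default_rng(0)
depth = rng.normal(size=(50, 200))   # samples x exons, assumed already z-scored
depth[10, 80:90] += 3.0              # a fake duplication signal in one sample

U, S, Vt = np.linalg.svd(depth, full_matrices=False)
S[:3] = 0.0                          # drop the strongest components (batch/capture noise)
denoised = U @ np.diag(S) @ Vt       # reconstruct the matrix without them

# Exon positions whose residual signal exceeds a threshold are CNV candidates.
candidates = np.argwhere(np.abs(denoised) > 2.5)
print(candidates[:10])
```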
http://rdxplorer.sourceforge.net/
A computational tool for detecting copy number variants (CNVs) in whole human genome sequence data using read depth (RD) coverage.
Proper citation: RDXplorer (RRID:SCR_013290)
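Read-depth (RD) methods in general count reads in fixed windows and flag windows whose coverage departs from the expectation; the sketch below shows that basic idea with a simple z-score cutoff. It is not RDXplorer's own statistical method, and the window size, cutoff, and simulated read positions are illustrative.

```python
# Generic illustration of the read-depth (RD) idea behind CNV detection:
# count reads in fixed windows and flag windows with unusual depth.
# This is NOT RDXplorer's statistical method, only the basic concept.
import numpy as np

def window_counts(read_starts, chrom_length, window=10_000):
    """Histogram read start positions into fixed-size windows."""
    edges = np.arange(0, chrom_length + window, window)
    counts, _ = np.histogram(read_starts, bins=edges)
    return counts

def flag_windows(counts, z_cutoff=3.0):
    """Return indices of windows whose depth deviates strongly from the mean."""
    z = (counts - counts.mean()) / counts.std()
    return np.where(np.abs(z) > z_cutoff)[0]

rng = np.random.default_rng(1)
reads = rng.integers(0, 1_000_000, size=50_000)                               # uniform background
reads = np.concatenate([reads, rng.integers(400_000, 410_000, size=2_000)])   # a gained region
counts = window_counts(reads, chrom_length=1_000_000)
print(flag_windows(counts))  # flags the window covering 400-410 kb
```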
http://sourceforge.net/projects/pia2/
Prefix indexing and alignment software for next-generation sequencing (NGS) data from the whole human genome.
Proper citation: PIA (RRID:SCR_013267)
A small tool for simulating sequence reads from a reference genome.
Proper citation: Wgsim (RRID:SCR_013269)
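As a conceptual illustration of simulating reads from a reference genome, the sketch below samples single-end reads and injects substitution errors at a fixed rate. It is a deliberately simplified stand-in, not wgsim's error and mutation model or its command-line interface; the read length, error rate, and reference are illustrative.

```python
# Much-simplified conceptual sketch of read simulation from a reference
# sequence with substitution errors; NOT wgsim's actual model or CLI.
import random

def simulate_reads(reference, n_reads=5, read_len=50, error_rate=0.01, seed=42):
    rng = random.Random(seed)
    bases = "ACGT"
    reads = []
    for _ in range(n_reads):
        start = rng.randrange(0, len(reference) - read_len + 1)  # pick a start position
        read = list(reference[start:start + read_len])
        for i in range(read_len):
            if rng.random() < error_rate:                        # inject a substitution error
                read[i] = rng.choice([b for b in bases if b != read[i]])
        reads.append((start, "".join(read)))
    return reads

reference = "ACGT" * 200  # toy 800 bp reference
for start, seq in simulate_reads(reference):
    print(start, seq)
```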
Tripod is a user-friendly chemical genomics browser that is currently being developed by the informatics group at the NIH Chemical Genomics Center. The main goal of Tripod is to facilitate easy access to chemical and biological data in an intuitive, user-friendly tool. To this end, the development of Tripod is inspired by the ubiquitous iTunes software, whereby the browsing and management of media content is adapted to chemical and biological data.
Proper citation: Tripod (RRID:SCR_013147)
http://soap.genomics.org.cn/SOAPdenovo-Trans.html
A de novo transcriptome assembler based on the SOAPdenovo framework, adapted to alternative splicing and different expression levels among transcripts. THIS RESOURCE IS NO LONGER IN SERVICE. Documented on September 16, 2025.
Proper citation: SOAPdenovo-Trans (RRID:SCR_013268)
http://compbio.cs.ucr.edu/brat/
BRAT is an accurate and efficient tool for mapping short bisulfite-treated reads obtained from the Solexa-Illumina Genome Analyzer.
Proper citation: BRAT (RRID:SCR_013159)
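Bisulfite treatment converts unmethylated cytosines so that they read as thymine, which is why bisulfite reads are commonly aligned against converted reference sequences; the sketch below shows that in-silico C-to-T (and reverse-strand G-to-A) conversion as a concept only, not BRAT's implementation.

```python
# In-silico bisulfite conversion of a reference: unmethylated cytosines read
# as thymine after bisulfite treatment, so mappers commonly align reads against
# C->T (forward strand) and G->A (reverse strand) converted sequences.
# Conceptual illustration only, not BRAT's code.
def bisulfite_convert(seq):
    forward = seq.replace("C", "T")   # C->T conversion for the forward strand
    reverse = seq.replace("G", "A")   # G->A conversion models the reverse strand
    return forward, reverse

reference = "ACGCGTTACGGA"
fwd, rev = bisulfite_convert(reference)
print("original:", reference)
print("C->T    :", fwd)
print("G->A    :", rev)
```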
http://alumni.cs.ucr.edu/~liw/cem.html
An algorithm to assemble transcripts and estimate their expression levels from RNA-Seq reads.
Proper citation: CEM (RRID:SCR_013241)
http://qccpack.sourceforge.net
QccPack provides an open-source collection of library routines and utility programs for quantization, compression, and coding of data. QccPack has been written to provide very flexible and general implementations of procedures commonly used in coding and compression applications. QccPack is intended for use in the development of prototypes of coding and compression systems, and in academic research. QccPack includes routines for entropy coding, scalar quantization, vector quantization, and wavelet transforms. Additionally, an open-source implementation of SPIHT is available as an optional module. The essential component of the QccPack collection is a library (a static library, libQccPack.a, and, if supported on your system, a dynamic library, libQccPack.so) of procedures implementing a large variety of compression and coding algorithms. Application programs may make use of the QccPack library routines by linking the application against the library during compilation. Each library function is very general in its implementation so as to be useful in a large variety of applications. Additionally, much of the functionality of the library routines has been provided in the form of stand-alone executable programs. The prime importance of these utility programs is probably that they provide examples of how to interface with many of the QccPack library routines. The utility programs could also be called from scripts to simulate the operation of complex coding and compression systems before implementing all the system functionality into one stand-alone program. Currently, QccPack consists of over 55,000 lines of C code implementing over 500 library routines and over 50 stand-alone utility programs.
The major functionalities currently implemented include:
* Entropy coding
  o Arithmetic coding, including multiple-context adaptive and nonadaptive models
  o Huffman coding
  o Golomb and adaptive Golomb coding
* Scalar quantization (SQ)
  o Uniform SQ
  o Dead-zone SQ
  o µ-law and A-law SQ
  o Lloyd algorithm for optimal SQ design
* Vector quantization (VQ)
  o Generalized Lloyd algorithm (GLA) for VQ-codebook design
  o Full-search VQ encoding and decoding
  o Entropy-constrained VQ (ECVQ) training, encoding, and decoding
  o Multistage VQ (MSVQ), also called residual VQ (RVQ): training, encoding, and decoding
* Adaptive vector quantization (AVQ)
  o The generalized-threshold-replenishment (GTR) algorithm
  o The Paul algorithm
  o The Gersho-Yano algorithm
  o Coding of side information
* Wavelet transforms and wavelet-based subband coding
  o Discrete wavelet transform (DWT) using first-generation filter banks and popular orthonormal and biorthogonal wavelets
  o Lifting implementations of the DWT for popular wavelets
  o Two-dimensional DWT in the form of dyadic subband pyramids
  o Three-dimensional DWT in the form of dyadic subband pyramids as well as a packet transform
  o Shape-adaptive DWT (SA-DWT) for 1D and 2D signals
  o Redundant DWT (RDWT), a.k.a. the algorithme à trous
  o The SR algorithm for wavelet-based image coding
  o The SFQ algorithm for wavelet-based image coding
  o The WDR algorithm for wavelet-based image coding
  o The 3D-WDR algorithm for wavelet-based image-cube coding
  o The tarp-filter algorithm for wavelet-based image coding
  o The 3D-tarp algorithm for wavelet-based image-cube coding
  o The TCE algorithm for wavelet-based image coding
  o The BISK algorithm for wavelet-based shape-adaptive image coding
  o The 3D-BISK algorithm for wavelet-based image-cube coding
* Error-correcting codes
  o Field arithmetic, including Gaussian-elimination matrix inversion
  o Reed-Solomon encoding and decoding
  o CRC codes
  o Trellis codes
  o Hard and soft Viterbi decoding
* Image processing
  o Routines for reading and writing gray and color still images and sequences of images (via PGM and PPM formats)
  o Routines for reading and writing 3D image-cube volumes
  o Image and image-sequence deinterlacing
  o Image differential pulse-code modulation (DPCM)
  o Color-space conversions: RGB, YUV, CIE XYZ, CIE UCS, CIE modified UCS
  o Block-based DCT and inverse DCT
* Video coding
  o The spatial-block algorithm for image-sequence coding
  o The RDWT-block algorithm for image-sequence coding
  o The RWMH algorithm for image-sequence coding
  o Block-based motion estimation and motion compensation
  o Motion estimation and motion compensation using regular triangle meshes
  o Encoding and decoding of motion-vector fields
* General routines
  o Vector math (up/down sampling, sorting, dot product, addition/subtraction, etc.)
  o Matrix math (addition/subtraction, vector-matrix multiplication, etc.)
  o Linked lists and associated operations
  o Entropy estimation (first and second order)
  o General file input and output, including automatic detection and reading/writing of gzip-compressed files
  o Character bit-packing for binary bitstream input/output
  o Memory-based FIFO for binary bitstreams
  o Conversion between various file formats used by library routines
  o Error-message tracking, formatting, and output
  o Automatic command-line parameter parsing

In addition to the standard functionalities listed above, there exist optional modules that can be added to the QccPack library.
Usually, these modules are available under licensing terms different from the GPL/LGPL licenses of QccPack and may contain patented algorithms; refer to the documentation included with each module for specific details. These modules are downloaded separately from QccPack and are not enabled by default during the building of QccPack. The currently available optional modules and their functionalities are:
* QccPackSPIHT
  o The Set Partitioning in Hierarchical Trees (SPIHT) algorithm for wavelet-based image coding
* QccPackSPECK
  o The Set-Partitioning Embedded Block (SPECK) algorithm for wavelet-based image coding

Abstract: We describe the QccPack software package, an open-source collection of library routines and utility programs for quantization, compression, and coding of data. QccPack is being written to expedite data-compression research and development by providing general and reliable implementations of common compression techniques. Functionality of the current release includes entropy coding, scalar quantization, vector quantization, adaptive vector quantization, wavelet transforms and subband coding, error-correcting codes, image-processing support, and general vector-math, matrix-math, file-I/O, and error-message routines. All QccPack functionality is accessible via library calls; additionally, many utility programs provide command-line access. The QccPack software package, downloadable free of charge from the QccPack Web page, is published under the terms of the GNU General Public License and the GNU Library General Public License, which guarantee source-code access and allow redistribution and modification. Additionally, there exist optional modules that implement certain patented algorithms. These modules are downloadable separately and are typically issued under licenses that permit only non-commercial use. This material is based upon work supported in part by the National Science Foundation under Grant No. INT-9600260.
Proper citation: QccPack (RRID:SCR_013240)
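QccPack's own routines are C library calls linked against libQccPack; purely to illustrate what two of the listed scalar-quantization operations compute, the numpy sketch below implements uniform and dead-zone scalar quantization. It does not use QccPack's API, and the step size and reconstruction rule are illustrative choices.

```python
# Small numpy illustration of uniform and dead-zone scalar quantization, two
# operations QccPack's SQ routines cover. This does NOT use QccPack's C API;
# it only shows the underlying arithmetic with illustrative parameters.
import numpy as np

def uniform_quantize(x, step):
    """Midtread uniform quantizer: round to the nearest multiple of `step`."""
    indices = np.round(x / step).astype(int)
    return indices, indices * step                 # (quantizer indices, reconstruction)

def deadzone_quantize(x, step):
    """Dead-zone quantizer: values in (-step, step) map to zero."""
    indices = (np.sign(x) * np.floor(np.abs(x) / step)).astype(int)
    reconstruction = np.sign(indices) * (np.abs(indices) + 0.5) * step
    return indices, reconstruction

x = np.array([-2.7, -0.4, 0.1, 0.9, 3.2])
print(uniform_quantize(x, step=1.0))
print(deadzone_quantize(x, step=1.0))
```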
Can't find your Tool?
We recommend that you click next to the search bar to check some helpful tips on searches and refine your search first. Alternatively, you can register your tool with the SciCrunch Registry by adding a little information to a web form; logging in will enable you to create a provisional RRID, but it is not required to submit.