Searching the RRID Resource Information Network

SciCrunch Registry is a curated repository of scientific resources, with a focus on biomedical resources, including tools, databases, and core facilities - visit SciCrunch to register your resource.

Search


Page 65: showing results 1281–1300 of 26,151.
  • RRID:SCR_013268

    This resource has 100+ mentions.

http://soap.genomics.org.cn/SOAPdenovo-Trans.html

A de novo transcriptome assembler based on the SOAPdenovo framework, adapted to alternative splicing and to differing expression levels among transcripts. THIS RESOURCE IS NO LONGER IN SERVICE. Documented on September 16, 2025.

Proper citation: SOAPdenovo-Trans (RRID:SCR_013268)


  • RRID:SCR_013159

    This resource has 50+ mentions.

http://compbio.cs.ucr.edu/brat/

BRAT is an accurate and efficient tool for mapping short bisulfite-treated reads obtained from the Solexa-Illumina Genome Analyzer.

Proper citation: BRAT (RRID:SCR_013159)


  • RRID:SCR_013241

    This resource has 1+ mentions.

http://alumni.cs.ucr.edu/~liw/cem.html

An algorithm to assemble transcripts and estimate their expression levels from RNA-Seq reads.

Proper citation: CEM (RRID:SCR_013241)


  • RRID:SCR_013240

    This resource has 1+ mentions.

http://qccpack.sourceforge.net

QccPack provides an open-source collection of library routines and utility programs for quantization, compression, and coding of data. It has been written to provide very flexible and general implementations of procedures commonly used in coding and compression applications, and it is intended for use in the development of prototypes of coding and compression systems and in academic research. QccPack includes routines for entropy coding, scalar quantization, vector quantization, and wavelet transforms; additionally, an open-source implementation of SPIHT is available as an optional module.

The essential component of the QccPack collection is a library (a static library, libQccPack.a, and, if supported on your system, a dynamic library, libQccPack.so) of procedures implementing a large variety of compression and coding algorithms. Application programs may make use of the QccPack library routines by linking against the library during compilation. Each library function is implemented very generally so as to be useful in a large variety of applications. Additionally, much of the functionality of the library routines is also provided in the form of stand-alone executable programs. The prime importance of these utility programs is probably that they provide examples of how to interface with many of the QccPack library routines; the utility programs can also be called from scripts to simulate the operation of complex coding and compression systems before all of the system functionality is implemented in one stand-alone program. Currently, QccPack consists of over 55,000 lines of C code implementing over 500 library routines and over 50 stand-alone utility programs.

The major functionalities currently implemented include:
  • Entropy coding: arithmetic coding (including multiple-context adaptive and nonadaptive models), Huffman coding, and Golomb and adaptive Golomb coding
  • Scalar quantization (SQ): uniform SQ, dead-zone SQ, µ-law and A-law SQ, and the Lloyd algorithm for optimal SQ design
  • Vector quantization (VQ): the generalized Lloyd algorithm (GLA) for VQ-codebook design; full-search VQ encoding and decoding; entropy-constrained VQ (ECVQ) training, encoding, and decoding; and multistage VQ (MSVQ, also called residual VQ, RVQ) training, encoding, and decoding
  • Adaptive vector quantization (AVQ): the generalized-threshold-replenishment (GTR) algorithm, the Paul algorithm, the Gersho-Yano algorithm, and coding of side information
  • Wavelet transforms and wavelet-based subband coding: the discrete wavelet transform (DWT) using first-generation filter banks and popular orthonormal and biorthogonal wavelets; lifting implementations of the DWT for popular wavelets; the two-dimensional DWT in the form of dyadic subband pyramids; the three-dimensional DWT in the form of dyadic subband pyramids as well as a packet transform; the shape-adaptive DWT (SA-DWT) for 1D and 2D signals; the redundant DWT (RDWT), a.k.a. the algorithme à trous; the SR, SFQ, WDR, tarp-filter, and TCE algorithms for wavelet-based image coding; the BISK algorithm for wavelet-based shape-adaptive image coding; and the 3D-WDR, 3D-tarp, and 3D-BISK algorithms for wavelet-based image-cube coding
  • Error-correcting codes: field arithmetic (including Gaussian-elimination matrix inversion), Reed-Solomon encoding and decoding, CRC codes, trellis codes, and hard and soft Viterbi decoding
  • Image processing: routines for reading and writing gray and color still images and image sequences (via the PGM and PPM formats); routines for reading and writing 3D image-cube volumes; image and image-sequence deinterlacing; image differential pulse-code modulation (DPCM); color-space conversions (RGB, YUV, CIE XYZ, CIE UCS, CIE modified UCS); and block-based DCT and inverse DCT
  • Video coding: the spatial-block, RDWT-block, and RWMH algorithms for image-sequence coding; block-based motion estimation and motion compensation; motion estimation and motion compensation using regular triangle meshes; and encoding and decoding of motion-vector fields
  • General routines: vector math (up/down sampling, sorting, dot product, addition/subtraction, etc.); matrix math (addition/subtraction, vector-matrix multiplication, etc.); linked lists and associated operations; entropy estimation (first and second order); general file input and output, including automatic detection and reading/writing of gzip-compressed files; character bit-packing for binary bitstream input/output; a memory-based FIFO for binary bitstreams; conversion between the various file formats used by the library routines; error-message tracking, formatting, and output; and automatic command-line parameter parsing

In addition to the standard functionalities listed above, optional modules can be added to the QccPack library. These modules are usually available under licensing terms different from the GPL/LGPL licenses of QccPack and may contain patented algorithms; refer to the documentation included with each module for specific details. They are downloaded separately from QccPack and are not enabled by default when QccPack is built. The currently available optional modules and their functionalities are:
  • QccPackSPIHT: the Set Partitioning in Hierarchical Trees (SPIHT) algorithm for wavelet-based image coding
  • QccPackSPECK: the Set-Partitioning Embedded Block (SPECK) algorithm for wavelet-based image coding

Abstract: We describe the QccPack software package, an open-source collection of library routines and utility programs for quantization, compression, and coding of data. QccPack is being written to expedite data-compression research and development by providing general and reliable implementations of common compression techniques. Functionality of the current release includes entropy coding, scalar quantization, vector quantization, adaptive vector quantization, wavelet transforms and subband coding, error-correcting codes, image-processing support, and general vector-math, matrix-math, file-I/O, and error-message routines. All QccPack functionality is accessible via library calls; additionally, many utility programs provide command-line access. The QccPack software package, downloadable free of charge from the QccPack Web page, is published under the terms of the GNU General Public License and the GNU Library General Public License, which guarantee source-code access and allow redistribution and modification. Additionally, there exist optional modules that implement certain patented algorithms; these modules are downloadable separately and are typically issued under licenses that permit only non-commercial use. This material is based upon work supported in part by the National Science Foundation under Grant No. INT-9600260.

Proper citation: QccPack (RRID:SCR_013240)
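
The entry above notes that much of QccPack's functionality is also exposed as stand-alone utility programs, and that these utilities can be called from scripts to simulate a complex coding and compression system before it is reimplemented as a single program linked against libQccPack.a / libQccPack.so. Below is a minimal sketch of that scripting pattern in Python; the utility names sq_encode and sq_decode and their arguments are hypothetical placeholders rather than actual QccPack program names, so substitute the utilities shipped with your QccPack build.

    import subprocess

    def run(cmd):
        """Run one utility invocation and stop the pipeline if it fails."""
        print("running:", " ".join(cmd))
        subprocess.run(cmd, check=True)

    # Hypothetical two-stage pipeline: scalar-quantize a raw data file, then
    # reconstruct it, chaining stand-alone utilities from a script before the
    # whole system is folded into one program linked against the QccPack library.
    run(["sq_encode", "input.dat", "quantized.bits"])    # placeholder utility name
    run(["sq_decode", "quantized.bits", "output.dat"])   # placeholder utility name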


  • RRID:SCR_013243

http://bioinfo.au.tsinghua.edu.cn/seqsite/

Software for detecting transcription factor binding sites from ChIP-seq data.

Proper citation: SeqSite (RRID:SCR_013243)


  • RRID:SCR_013242

    This resource has 10+ mentions.

http://www.bioconductor.org/packages/2.9/bioc/html/Repitools.html

Software tools for the analysis of enrichment-based epigenomic data.

Proper citation: Repitools (RRID:SCR_013242)


  • RRID:SCR_013245

    This resource has 1+ mentions.

http://www.bork.embl.de/software/smash/

A stand-alone metagenomic annotation and analysis pipeline suitable for data from Sanger and 454 sequencing technologies.

Proper citation: SmashCommunity (RRID:SCR_013245)


  • RRID:SCR_013249

    This resource has 10+ mentions.

http://www.ncbi.nlm.nih.gov/books/NBK25500/

Entrez Programming Utilities are tools that provide access to Entrez data outside of the regular web query interface and may be helpful for retrieving search results for future use in another environment.
Additional information is available in the NCBI Bookshelf short course Building Customized Data Pipelines Using the Entrez Programming Utilities (eUtils) and in the NCBI PowerScripting course.
User Requirements: please see the User Requirements section below for important information on scripting against NCBI servers.
EInfo: Provides field index term counts, last update, and available links for each database.
ESearch: Searches and retrieves primary IDs (for use in EFetch, ELink, and ESummary) and term translations, and optionally retains results for future use in the user's environment.
EPost: Posts a file containing a list of primary IDs for future use in the user's environment with subsequent search strategies.
ESummary: Retrieves document summaries from a list of primary IDs or from the user's environment.
EFetch: Retrieves records in the requested format from a list of one or more primary IDs or from the user's environment.
ELink: Checks for the existence of an external or Related Articles link from a list of one or more primary IDs. Retrieves primary IDs and relevancy scores for links to Entrez databases or Related Articles; creates a hyperlink to the primary LinkOut provider for a specific ID and database, or lists LinkOut URLs and Attributes for multiple IDs.
EGQuery: Provides Entrez database counts in XML for a single search using Global Query.
ESpell: Retrieves spelling suggestions.
Related resources: SOAP Interface for Entrez Utilities, PMID to PMC ID Converter, Entrez DTDs, Demonstration Program, Announcement Mailing List, Leasing Data from the National Library of Medicine, Help Desk.
User Requirements
Do not overload NCBI's systems. Users intending to send numerous queries and/or retrieve large numbers of records from Entrez should comply with the following:
Run retrieval scripts on weekends or between 9 pm and 5 am Eastern Time weekdays for any series of more than 100 requests.
Send E-utilities requests to http://eutils.ncbi.nlm.nih.gov, not the standard NCBI Web address.
Make no more than 3 requests every 1 second.
Use the URL parameters email and tool (for distributed software) so that we can track your project and contact you if there is a problem.
NCBI's Disclaimer and Copyright notice must be evident to users of your service. NLM does not claim copyright on the abstracts in PubMed; however, journal publishers or authors may. NLM provides no legal advice concerning distribution of copyrighted materials; consult your legal counsel.

Proper citation: Entrez Utilities (RRID:SCR_013249)
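
As a rough illustration of the usage guidelines above, the sketch below (Python, standard library only) chains ESearch and EFetch: it sends requests to https://eutils.ncbi.nlm.nih.gov as required, passes the email and tool URL parameters, and sleeps between calls so that no more than 3 requests go out per second. The database (pubmed), query term, tool name, and email address are placeholder assumptions, not values taken from this registry entry.

    import time
    import urllib.parse
    import urllib.request
    import xml.etree.ElementTree as ET

    EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"
    TOOL = "example_pipeline"      # placeholder tool name for NCBI to track
    EMAIL = "you@example.org"      # placeholder contact address

    def eutils_get(endpoint, **params):
        """Send one E-utilities request and return the response body as text."""
        params.update(tool=TOOL, email=EMAIL)
        url = f"{EUTILS}/{endpoint}.fcgi?" + urllib.parse.urlencode(params)
        with urllib.request.urlopen(url) as resp:
            body = resp.read().decode()
        time.sleep(0.34)           # stay under 3 requests per second
        return body

    # ESearch: retrieve primary IDs (here, PubMed IDs) matching a query.
    search_xml = eutils_get("esearch", db="pubmed",
                            term="bisulfite sequencing", retmax=5)
    ids = [e.text for e in ET.fromstring(search_xml).findall(".//Id")]

    # EFetch: retrieve the records for those IDs as plain-text abstracts.
    if ids:
        print(eutils_get("efetch", db="pubmed", id=",".join(ids),
                         rettype="abstract", retmode="text"))

Per the requirements above, a series of more than 100 such requests should still be scheduled for weekends or for 9 pm to 5 am Eastern Time on weekdays.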


  • RRID:SCR_013248

http://www.softpedia.com/get/Science-CAD/BrainVisa-Morphology-extensions.shtml

An extension project providing computational tools for performing regional morphological measurements to assess groupwise differences and track morphological changes during maturation and aging. The extensions include computation of regional GM thickness, the 3D gyrification index, sulcal length and depth, and sulcal span. These tools are distributed as plugins for the popular analysis package BrainVisa.

Proper citation: BrainVisa Morphology extensions (RRID:SCR_013248)


  • RRID:SCR_013257

    This resource has 1+ mentions.

http://sourceforge.net/projects/mirseq/files/

An R/Bioconductor based workflow for novel miRNA prediction from deep sequencing data.

Proper citation: miRSeqNovel (RRID:SCR_013257)


  • RRID:SCR_013095

    This resource has 50+ mentions.

http://sourceforge.net/projects/ligmap/files/

A tool for structural biology and drug design.

Proper citation: AutoMap (RRID:SCR_013095)


  • RRID:SCR_013217

http://sourceforge.net/projects/dynamicprog/

A model-based statistical method for base calling on Illumina's next-generation sequencing platforms.

Proper citation: DynamicProg (RRID:SCR_013217)


  • RRID:SCR_013188

http://www-rcf.usc.edu/~fsun/Programs/multiAlignFree/multiAlignFreemain.html

An R package implementing a program for multiple alignment-free sequence comparison based on long genome sequences or NGS data.

Proper citation: multiAlignFree (RRID:SCR_013188)


  • RRID:SCR_013104

http://www.usphs.gov/

The Commissioned Corps of the United States Public Health Service is the federal uniformed service of the U.S. Public Health Service and is one of the eight uniformed services of the United States.

Proper citation: U.S. Public Health Service Commissioned Corps (RRID:SCR_013104)


  • RRID:SCR_013180

http://www.cdc.gov/niosh/oep/funding.html

Proper citation: National Institute for Occupational Safety and Health (RRID:SCR_013180)


  • RRID:SCR_013181

http://www.bioconductor.org/packages//2.10/bioc/html/CancerMutationAnalysis.html

Software package that implements gene and gene-set level analysis methods for somatic mutation studies of cancer.

Proper citation: CancerMutationAnalysis (RRID:SCR_013181)


  • RRID:SCR_013229

    This resource has 10+ mentions.

http://beads.sourceforge.net/

Software for a normalization scheme that corrects nucleotide composition bias, mappability variations and differential local DNA structural effects in deep sequencing data.

Proper citation: BEADS (RRID:SCR_013229)


  • RRID:SCR_013234

    This resource has 100+ mentions.

http://www.lifetechnologies.com/fr/fr/home/technical-resources/software-downloads/lifescope-genomic-analysis-software.html

Genomic Analysis Software designed to match the accuracy of the next generation 5500 Genetic Analyzers with Exact Call Chemistry (ECC).

Proper citation: LifeScope (RRID:SCR_013234)


  • RRID:SCR_013190

    This resource has 1+ mentions.

http://sourceforge.net/projects/congrpe/

A de novo assembly algorithm for Next-Generation Sequencing technology.

Proper citation: CongrPE (RRID:SCR_013190)


  • RRID:SCR_013192

http://sourceforge.net/projects/callsim/

A software application that provides evidence for the validity of base calls believed to be sequencing errors; it is applicable to Ion Torrent and 454 data.

Proper citation: CallSim (RRID:SCR_013192)



Can't find your Tool?

We recommend that you first click the button next to the search bar to review some helpful search tips and refine your search. Alternatively, you can register your tool with the SciCrunch Registry by adding a little information to a web form; logging in will let you create a provisional RRID, but it is not required to submit.

Can't find the RRID you're searching for?
  1. Neuroscience Information Framework Resources

    Welcome to the NIF Resources search. From here you can search through a compilation of resources used by NIF and see how data is organized within our community.

  2. Navigation

You are currently on the Community Resources tab, looking through the categories and sources that NIF has compiled. You can navigate through those categories from here or switch to a different tab to run your search against. Each tab gives a different perspective on the data.

  3. Logging in and Registering

If you have an account on NIF, you can log in from here to get additional features such as Collections, Saved Searches, and managing Resources.

  4. Searching

Here is the search term being executed; you can type in anything you want to search for. Some tips to help with searching:

    1. Use quotes around phrases you want to match exactly
    2. You can manually AND and OR terms to change how we search between words
    3. You can add "-" to terms to make sure no results return with that term in them (ex. Cerebellum -CA1)
    4. You can add "+" to terms to require they be in the data
Using autocomplete specifies which branch of our semantics you wish to search and can help refine your search
  5. Save Your Search

You can save any searches you perform here, for quick access later.

  6. Query Expansion

We recognized your search term and included synonyms and inferred terms alongside your term to help find the data you are looking for.

  7. Collections

    If you are logged into NIF you can add data records to your collections to create custom spreadsheets across multiple sources of data.

  8. Sources

Here are the sources that were queried in your search; you can investigate them further.

  9. Categories

Here are the categories present within NIF that you can filter your data on.

  10. Subcategories

Here are the subcategories present within this category that you can filter your data on.

  11. Further Questions

    If you have any further questions please check out our FAQs Page to ask questions and see our tutorials. Click this button to view this tutorial again.
