
Quickstart for Enterprise

Enterprises and institutions operating on-chain businesses require SLA guarantees, security certifications, and operational continuity from day one — without building their own infrastructure. Nodit delivers these on a SOC 2-certified foundation with 99.9% SLA, supporting phased expansion from PoC to production.

SLA 99.9%+
High Availability

A managed infrastructure built on a multi-node configuration with smart load balancing that accounts for sync state, automatic failover, and a cluster-based monitoring system. Depending on operational requirements, you can choose between Elastic Node or a Dedicated Node deployed in an isolated environment.

Security & Operations
Security & Compliance

IP/Domain Allowlist, RBAC-based access control, and a Request Log-based audit framework can be configured immediately.

Zero CapEx
Economy & Observability

Start with no upfront infrastructure investment. CU-based real-time usage monitoring, per-project Auto-scaling, and a path to dedicated node deployment enable cost-optimized, phased expansion.

24/7 Monitoring
Operational Continuity

Monitor service status in real time on the Status page and integrate it with your own monitoring systems. Direct inquiry and escalation through a dedicated operations team channel is also supported.

Nodit Usecases for Enterprise

Depending on the nature of the on-chain business, enterprises and institutions can leverage Nodit across two domains: node infrastructure and data services. The two paths can be used independently or combined, and security and audit environment configuration applies to both.

| | Node Infrastructure | Data Services |
|---|---|---|
| Purpose | Real-time blockchain communication and latest data retrieval | Leveraging indexed historical on-chain data |
| Services | Elastic Node, Dedicated Node | Web3 Data API, Webhook, Stream, Datasquare |
| Use Cases | Transaction submission, smart contract deployment and calls, latest block retrieval, etc. | Token holder and transfer history, wallet asset queries, deposit detection, on-chain event notifications, etc. |
| Getting Started | Sign up in the console, create a project, and use the free Elastic Node endpoint | Sign up in the console, issue an API Key, and immediately call APIs or create Webhooks |


Quickstart for Node Infrastructure

Test functional requirements with a free Elastic Node, then scale to an optimized Dedicated Node based on operational needs.

Step 1. Testing with Elastic Node

Even for enterprise and institutional accounts, you can start using nodes immediately by signing up at the Nodit Console and creating a project — no separate contract required. Elastic Node is a shared node cluster environment with a guaranteed 99.9% SLA, making it well-suited for the technical validation that must precede internal decision-making.

Key items to test

  • Network coverage — Verify that nodes for the target chain and network (Mainnet/Testnet) are supported on the Supported Chains page or the Chains screen in the console.
  • Node API compatibility — Confirm that the JSON-RPC methods used in your existing systems work correctly on Elastic Node. Validate the response format and processing results for core methods such as eth_call, eth_sendRawTransaction, and eth_getLogs.
  • Performance measurement against traffic patterns — Based on actual service usage, measure API rate limit constraints, successful response counts at peak request frequency, and average response times. These measurements serve as the basis for plan selection and Dedicated Node specification sizing at transition.
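As a concrete starting point for the compatibility checks above, the sketch below builds JSON-RPC 2.0 payloads and validates response envelopes for core methods. The endpoint URL is a placeholder — substitute the Elastic Node endpoint issued in your console project.

```python
import json
import urllib.request

# Placeholder — replace with the Elastic Node endpoint from your Nodit project.
ENDPOINT = "https://ethereum-mainnet.example-node-endpoint/"

def rpc_payload(method, params, req_id=1):
    """Build a JSON-RPC 2.0 request body as bytes."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": method,
        "params": params,
    }).encode()

def validate_envelope(raw):
    """Check that a node response is a well-formed JSON-RPC envelope:
    it must echo jsonrpc 2.0 and carry either a result or an error.
    Returns (is_valid, error_object_or_None)."""
    body = json.loads(raw)
    ok = body.get("jsonrpc") == "2.0" and ("result" in body or "error" in body)
    return ok, body.get("error")

def call_node(method, params):
    """Send one JSON-RPC call to the configured endpoint (network I/O)."""
    req = urllib.request.Request(
        ENDPOINT,
        data=rpc_payload(method, params),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return validate_envelope(resp.read())

# Core methods to validate during the PoC, mirroring the checklist above.
CORE_METHODS = [
    ("eth_getLogs", [{"fromBlock": "latest", "toBlock": "latest"}]),
    ("eth_call", [{"to": "0x" + "00" * 20, "data": "0x"}, "latest"]),
]
```

Iterating `call_node` over `CORE_METHODS` while recording wall-clock time per call also yields the latency figures needed for the performance measurement item.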
API Reference

API integration methods are covered in Quickstart for Developers. Authentication methods and call structure are described in API Overview.

Step 2. Deciding Whether to Transition to Production

After PoC validation, determine whether to remain on Elastic Node or transition to Dedicated Node based on your operational requirements.

| Criteria | Stay on Elastic Node | Recommended: Transition to Dedicated Node |
|---|---|---|
| Traffic volume | Within per-plan request limits with Auto-scale at peak | Exceeds plan limits or sharp growth is expected |
| Infrastructure isolation | Shared environment is acceptable | Complete separation from other users is required |
| Security audit | Standard access control is sufficient | Evidence of dedicated infrastructure operation is required |
| SLA | 99.9% | 99.9% or higher guaranteed via contract |
| Incident response | Status page + public channels | Communication through a dedicated account manager channel |

Step 3. (Optional) Adopting Dedicated Node

Dedicated Node is a node service running on isolated, dedicated infrastructure. Based on the performance requirements identified during the PoC phase, it enables you to build a node infrastructure and network environment optimized for your service.

Adoption process

1
Requirements Consultation

Discuss the networks, expected traffic volume, and security requirements. Sharing traffic pattern data measured during the PoC accelerates specification sizing.

2
Infrastructure Design and Quotation

Dedicated node specifications and costs are determined based on the number of networks, concurrent request volume, and data retention period.

3
Provisioning and Migration

The node is deployed in an isolated environment, and existing endpoints and security settings are migrated.

4
Operations and Ongoing Support

A dedicated account manager supports technical issues and specification changes. Capacity expansion in response to traffic growth can be requested while in operation.

Full details are available in the Dedicated Node guide.


Quickstart for Data Service

Query indexed historical on-chain data or integrate event-based notifications into internal systems. This path requires no node infrastructure and can be followed on its own or alongside the node infrastructure quickstart above.

Step 1. Issuing an API Key and Testing Data Retrieval

Create a project in the Nodit Console and issue an API Key to immediately call Web3 Data API. Indexed on-chain data is accessible through REST API calls alone — no separate infrastructure setup or data pipeline configuration is required.
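A minimal call sketch follows, assuming the common pattern of a chain/network path segment and an `X-API-KEY` header; confirm the exact base URL, operation paths, and header name in the Web3 Data API guide.

```python
import json
import urllib.request

def build_data_api_request(api_key, chain, network, operation, body):
    """Assemble a Web3 Data API request. The URL shape and header name
    here are illustrative assumptions — check the Web3 Data API guide
    for the exact paths your plan exposes."""
    url = f"https://web3.nodit.io/v1/{chain}/{network}/{operation}"
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={"X-API-KEY": api_key, "Content-Type": "application/json"},
        method="POST",
    )

# Example: token transfer history for one account (hypothetical operation name).
req = build_data_api_request(
    api_key="YOUR_API_KEY",
    chain="ethereum",
    network="mainnet",
    operation="token/getTokenTransfersByAccount",
    body={"accountAddress": "0x" + "00" * 20},
)
# urllib.request.urlopen(req) would perform the actual call.
```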

Key items to test

  • Network coverage — Verify that the target chain and network (Mainnet/Testnet) are supported by the data services on the Supported Chains page or the Chains screen in the console.
  • API coverage verification — Check the Web3 Data API guide and the Chains screen to confirm that an API exists for the data domain you need (token balances, transfer history, NFT metadata, etc.) and the retrieval criteria you require (account address, transaction hash, block range, etc.). Also review the response field structure and data integrity to determine whether it can be mapped to your existing system's data model.
  • Data freshness validation — Measure whether the latency between an on-chain event occurring and that data being reflected in the indexed API response meets your service requirements. Reflection speed varies by chain and network characteristics, so trigger an actual transaction and verify the point at which it appears in the API response.
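The freshness check above can be scripted as a polling loop. `fetch_tx` is a hypothetical callable standing in for whatever Data API query returns your transaction once it has been indexed.

```python
import time

def measure_freshness(tx_hash, fetch_tx, poll_interval=1.0, timeout=120.0):
    """Poll the indexed API until the given transaction appears, and
    return the observed indexing latency in seconds (or None on timeout).
    `fetch_tx` is any callable that queries the Data API by tx hash and
    returns a truthy record once the transaction has been indexed."""
    start = time.monotonic()
    while time.monotonic() - start < timeout:
        if fetch_tx(tx_hash):
            return time.monotonic() - start
        time.sleep(poll_interval)
    return None
```

Run this immediately after submitting a test transaction; the returned latency is the chain-specific reflection delay to compare against your service requirements.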
API Reference

All Web3 Data API methods and response schemas are available in the Web3 Data API guide.

Step 2. Selecting a Service Type by Use Case

Data services are categorized into four types based on their data access pattern. They can be used individually or in combination depending on your service requirements.

| Service | Data Access Pattern | Use Case | Getting Started |
|---|---|---|---|
| Web3 Data API | Pull — query via REST API when needed | Asset dashboards, portfolio queries, AML fund tracing, regulatory audit reporting | Web3 Data API |
| Webhook | Push — receive HTTP callbacks when events occur | Automated deposit detection, compliance threshold alerts, real-time settlement triggers | Webhook |
| Stream | Push — real-time delivery over persistent WebSocket connection | Real-time data pipelines, risk monitoring, on-chain analytics | Stream |
| Datasquare | Query — SQL-based on-chain data analysis | Custom analytics queries, cross-chain data aggregation, business reports | Datasquare |

Pull vs. Push selection criteria

  • Querying data at specific points in time (balance checks, history extraction) → Web3 Data API
  • Reflecting events in internal systems immediately upon occurrence (deposit detection, notifications) → Webhook
  • Continuous per-block data streaming (analytics engines, risk systems) → Stream
  • Immediate loading of large-scale blockchain data without running a pipeline, or custom query-based analysis (analytics engines, custom data processing) → Datasquare

Step 3. Integrating with the Production Environment

Review the following items when applying data services to a production environment.

Web3 Data API

  • Pagination handling — For large-volume data retrieval, apply cursor-based pagination to reliably paginate through the full result set.
  • Rate limit verification — Compare per-plan request limits against actual call patterns to identify potential bottlenecks in advance. Per-plan limits are available in Rate Limits.
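A pagination drain loop might look like the following sketch; the `(items, next_cursor)` shape is an assumption — map it to the actual cursor and item field names in the Web3 Data API response schema.

```python
def fetch_all(fetch_page):
    """Drain a cursor-paginated endpoint into one list.
    `fetch_page(cursor)` must return (items, next_cursor); pass None as
    the initial cursor, and treat a falsy next_cursor as the last page."""
    items, cursor = [], None
    while True:
        page, cursor = fetch_page(cursor)
        items.extend(page)
        if not cursor:
            return items
```

Wrapping the page fetch behind a callable also makes it easy to insert rate-limit-aware sleeps between pages without touching the loop itself.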

Webhook

  • Signing Key validation — Verify message origin and integrity using the Signing Key when receiving Webhooks. This is required for external endpoint integrations.
  • Delivery failure handling — An automatic retry policy (Retry/Backoff) is applied, and delivery failure history is available in Delivery History. Manual resending is also possible via Easy Resend.
  • Flexible Webhook — Define fine-grained conditions using CEL expressions and select output fields to receive only the data you need. Use Live Sample and Test Webhook to validate rules before going to production.
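Signing Key validation commonly means recomputing an HMAC over the raw request body. The sketch below assumes HMAC-SHA256 with a hex-encoded signature — confirm the actual signature header and hash scheme in the Webhook guide before relying on it.

```python
import hashlib
import hmac

def verify_webhook(signing_key, payload, received_signature):
    """Recompute an HMAC-SHA256 over the raw payload bytes and compare
    it to the signature delivered with the webhook, using a
    constant-time comparison to avoid timing side channels."""
    expected = hmac.new(signing_key.encode(), payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, received_signature)
```

Always verify against the raw body bytes as received, before any JSON parsing or re-serialization, since whitespace changes would alter the digest.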

Stream

  • Connection stability — Implement automatic reconnect logic on the client side for WebSocket disconnections. Verify whether any data is lost at the point of reconnection and handle accordingly.
  • Event filtering — Configure the event types to receive in advance to reduce unnecessary data processing overhead. Supported events are listed in Stream Event Type.
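Reconnect logic can be kept independent of any particular WebSocket library by isolating the backoff schedule; `connect()` below is a placeholder for your client's blocking connect-and-consume call.

```python
import random
import time

def backoff_schedule(max_retries=6, base=1.0, cap=30.0):
    """Yield jittered exponential backoff delays for reconnect attempts."""
    for attempt in range(max_retries):
        delay = min(cap, base * (2 ** attempt))
        yield delay * (0.5 + random.random() / 2)  # half-to-full jitter

def run_with_reconnect(connect, max_retries=6, base=1.0):
    """Keep a Stream connection alive: call `connect()` (a placeholder
    that blocks until the WebSocket drops or raises ConnectionError)
    and reconnect with backoff. Returns the number of attempts made."""
    attempts = 0
    for delay in backoff_schedule(max_retries, base=base):
        attempts += 1
        try:
            connect()
            return attempts  # connect() returned cleanly — deliberate shutdown
        except ConnectionError:
            time.sleep(delay)
    return attempts
```

On each reconnect, compare the last block processed before the drop against the first event received afterward to detect any gap that needs backfilling via the Web3 Data API.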

Security and Operational Setup

For both node infrastructure and data services, the security and audit environment required for enterprise operations can be configured in the console. Security configurations established at this stage are preserved when services are expanded or when transitioning to Dedicated Node.

Access Control

API call origins can be restricted by Domain Name and Source IP. When the Allowlist is activated, requests from unregistered origins are blocked with an HTTP 403 error.

  • Domain Name Allowlist — Used to restrict requests originating from web browsers.
  • Source IP Allowlist — Used in fixed network environments such as server-to-server communication.
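When the Allowlist is active, a blocked request surfaces as a plain HTTP 403, which is worth translating into operator guidance in client code; the 429 mapping below is a general rate-limit convention, not taken from this guide.

```python
def explain_http_error(status):
    """Map API error statuses to actionable operator guidance. A 403
    from an allowlist-enabled project means the request origin is not
    registered in the project's Security settings."""
    if status == 403:
        return ("Blocked by Allowlist — register this origin's source IP "
                "or domain under the project's Security settings.")
    if status == 429:
        return "Rate limit exceeded — check plan limits or retry with backoff."
    return f"HTTP {status} — see Request Logs in the console for details."
```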

Configuration details are available in the Security guide.

Team Account (RBAC)

Converting to a team account lets multiple team members share project resources (API Keys, Webhooks, Request Logs, etc.). Permissions are managed through a separation of Team Owner and Team Member roles. Configuration details are available in the Team Account guide.

Request Logs

View up to 7 days of API call history and detailed results in the console. Filter by network, time range, HTTP status, error code, and other criteria for debugging and audit evidence. Usage details are available in the Request Logs guide.


Next Steps

The following actions can be started immediately in the Nodit Console.

  1. Create a project in the Nodit Console and connect Elastic Node.
  2. Go to Settings > API Keys and configure an IP Allowlist to establish access control.
  3. Go to Settings > Team to invite team members and assign roles.
  4. Verify the integration status by reviewing API call history in Request Logs.

Contact

For Dedicated Node adoption or other enterprise requirements, reach out through the following channels.